
3773 Scala Jobs - Page 47

JobPe aggregates listings for easy access; you apply directly on the original job portal.

4.0 - 6.0 years

0 Lacs

Hyderabad

Work from Office


Hello job seekers, greetings of the day. We are hiring for one of our product-based clients for the Hyderabad location.

Position: Scala Developer (permanent, work from office)
Total experience: 2 to 4 years
Notice period: Immediate to 7 days
Shift timings: Evening shifts (one-way cab drop and food facility provided)

Job description - Scala Developer:
• Build and maintain distributed backend services in Scala, leveraging ZIO, Akka, and functional programming for real-time personalization and recommendation
• Design and manage streaming pipelines using Akka Streams or ZIO Streams
• Work with large-scale Postgres datasets and multi-system data pipelines involving SQS, SNS, and Kinesis
• Partner with data scientists and product engineers to integrate recommendation logic into ad-serving systems
• Occasionally contribute to our Angular front end and Lua-based ad server logic
• Deliver production-grade code with strong attention to scalability, observability, and maintainability

Requirements - must-have experience:
• Production-grade backend development in Scala, with a deep understanding of functional programming principles (ZIO, Cats, etc.)
• Experience with distributed systems, asynchronous processing, and streaming architectures
• Proficiency in PostgreSQL and working with large-scale structured data
• Hands-on experience with message queueing and event streaming systems such as SQS, SNS, or Kinesis
• Strong reasoning about user behavior, recommendation algorithms, and lifecycle event tracking

Nice to have:
• Familiarity with the AWS ecosystem and Infrastructure-as-Code tools (Terraform, CDK, etc.)
• Experience with Angular and/or Lua
• Prior work in ad tech or real-time bidding systems

If you are interested in the above position, please share your updated resume at rakesh.b@yochana.com.
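For context on the streaming-pipeline work this listing describes, here is a minimal, illustrative ZIO Streams sketch in Scala (assuming ZIO 2.x). The event type, in-memory source, and grouping step are hypothetical stand-ins for an SQS/Kinesis-fed recommendation service, not the employer's code.

// A minimal, illustrative ZIO Streams sketch (assuming ZIO 2.x).
import zio._
import zio.stream._

// Hypothetical click event produced by an upstream queue such as SQS or Kinesis.
final case class ClickEvent(userId: String, itemId: String, timestamp: Long)

object RecommendationPipeline extends ZIOAppDefault {

  // Stand-in source; a real service would wrap an SQS/Kinesis consumer here.
  val events: ZStream[Any, Nothing, ClickEvent] =
    ZStream.fromIterable(
      List(
        ClickEvent("u1", "item-42", 1L),
        ClickEvent("u1", "item-7", 2L),
        ClickEvent("u2", "item-42", 3L)
      )
    )

  // Micro-batch the stream, group each batch by user, and print the per-user
  // item lists; the print is a placeholder for real recommendation logic.
  val pipeline: ZIO[Any, Nothing, Unit] =
    events
      .groupedWithin(100, 1.second)
      .map(batch => batch.groupBy(_.userId))
      .foreach { grouped =>
        ZIO.foreachDiscard(grouped.toList) { case (user, evs) =>
          Console.printLine(s"$user -> ${evs.map(_.itemId).mkString(", ")}").orDie
        }
      }

  def run = pipeline
}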

Posted 1 week ago

Apply

1.0 years

4 - 6 Lacs

Hyderābād

On-site

Basic qualifications:
• 1+ years of data engineering experience
• Experience with SQL
• Experience with data modeling, warehousing, and building ETL pipelines
• Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
• Experience with one or more scripting languages (e.g., Python, KornShell)

Do you want to be a leader on the team that takes Transportation and Retail models to the next generation? Do you have solid analytical thinking and metrics-driven decision making, and do you want to solve problems with solutions that will meet a growing worldwide need? Then Transportation is the team for you. We are looking for top-notch Data Engineers to be part of our world-class Business Intelligence for Transportation team.

• 4-7 years of experience performing quantitative analysis, preferably for an Internet or technology company
• Strong experience in data warehouse and business intelligence application development
• Data analysis: understand business processes, logical data models, and relational database implementations
• Expert knowledge of SQL; able to optimize complex queries
• Basic understanding of statistical analysis; experience in test design and measurement
• Able to execute research projects and generate practical results and recommendations
• Proven track record of working on complex modular projects and assuming a leading role in such projects
• Highly motivated, self-driven, and capable of defining own design and test scenarios
• Experience with scripting languages (e.g., Perl, Python) preferred
• BS/MS degree in Computer Science
• Evaluate and implement various big-data technologies and solutions (Redshift, Hive/EMR, Tez, Spark) to optimize processing of extremely large datasets in an accurate and timely fashion; experience with large-scale data processing, data structure optimization, and scalability of algorithms is a plus

Key job responsibilities:
1. Responsible for designing, building, and maintaining complex data solutions for Amazon's Operations businesses
2. Actively participates in the code review process, design discussions, team planning, and operational excellence, and constructively identifies problems and proposes solutions
3. Makes appropriate trade-offs, reuses where possible, and is judicious about introducing dependencies
4. Makes efficient use of resources (e.g., system hardware, data storage, query optimization, AWS infrastructure)
5. Knows about recent advances in distributed systems (e.g., MapReduce, MPP architectures, external partitioning)
6. Asks the right questions when the data model and requirements are not well defined, and comes up with designs that are scalable, maintainable, and efficient
7. Makes enhancements that improve the team's data architecture and make it easier to maintain (e.g., data auditing solutions, automating ad-hoc or manual operational steps)
8. Owns the data quality of important datasets and any new changes/enhancements

Experience with big data technologies such as Hadoop, Hive, Spark, and EMR. Experience with any ETL tool such as Informatica, ODI, SSIS, BODI, or Datastage.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
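As a rough illustration of the ETL-pipeline work such a role involves, here is a minimal Spark batch job in Scala. The bucket paths, columns, and business rule are hypothetical, not Amazon's.

// A minimal, illustrative Spark (Scala) batch ETL job.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object ShipmentsDailyRollup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("shipments-daily-rollup").getOrCreate()

    // Extract: hypothetical raw shipment events.
    val shipments = spark.read.parquet("s3://example-bucket/raw/shipments/")

    // Transform: one row per (ship_date, warehouse_id) with shipped-package counts.
    val daily = shipments
      .where(col("status") === "SHIPPED")
      .groupBy(col("ship_date"), col("warehouse_id"))
      .agg(count("*").as("packages_shipped"))

    // Load: partition the output so downstream queries can prune by date.
    daily.write
      .mode("overwrite")
      .partitionBy("ship_date")
      .parquet("s3://example-bucket/curated/shipments_daily/")

    spark.stop()
  }
}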

Posted 1 week ago

Apply

15.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Organizations everywhere struggle under the crushing costs and complexities of “solutions” that promise to simplify their lives, to create a better experience for their customers and employees, and to help them grow. Software is a choice that can make or break a business, create better or worse experiences, and propel or throttle growth. Business software has become a blocker instead of a way to get work done. There’s another option: Freshworks, with a fresh vision for how the world works.

At Freshworks, we build uncomplicated service software that delivers exceptional customer and employee experiences. Our enterprise-grade solutions are powerful, yet easy to use and quick to deliver results. Our people-first approach to AI eliminates friction, making employees more effective and organizations more productive. Over 72,000 companies, including Bridgestone, New Balance, Nucor, S&P Global, and Sony Music, trust Freshworks’ customer experience (CX) and employee experience (EX) software to fuel customer loyalty and service efficiency. And over 4,500 Freshworks employees make this possible, all around the world. Fresh vision. Real impact. Come build it with us.

Job description: As a member of the Data Platform team, you'll be at the forefront of transforming how the Freshworks data lake can be harnessed to the fullest in making data-driven decisions.

Key job responsibilities:
• Drive the backbone of our data platform by building robust pipelines that turn complex data into actionable insights using AWS and the Databricks platform
• Be a data detective by ensuring our data is clean, accurate, and trustworthy
• Write clean, efficient code that handles massive amounts of structured and unstructured data

Qualifications:
• Proficient in at least one major language (Scala or Python) and in Kafka (any variant)
• Write elegant and maintainable code, and be comfortable picking up new technologies
• Proficient in working with distributed systems, with experience in distributed processing frameworks that handle data in batch and near real time (e.g., Spark)
• Experience working with various AWS services and Databricks to build end-to-end data solutions that bring different systems together
• This role follows IST working hours and may require weekend availability for monitoring and support activities as needed
• Requires 8-15 years of experience in a related field

Additional information: At Freshworks, we are creating a global workplace that enables everyone to find their true potential, purpose, and passion irrespective of their background, gender, race, sexual orientation, religion, and ethnicity. We are committed to providing equal opportunity for all and believe that diversity in the workplace creates a more vibrant, richer work environment that advances the goals of our employees, communities, and the business.
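For a sense of the batch and near-real-time processing mentioned above, here is a minimal, illustrative Spark Structured Streaming job in Scala that reads from Kafka and lands data in cloud storage. The broker address, topic, and paths are hypothetical, not Freshworks infrastructure.

// A minimal, illustrative Spark Structured Streaming job reading from Kafka.
import org.apache.spark.sql.SparkSession

object EventsToLake {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("events-to-lake").getOrCreate()

    // Near-real-time source: a Kafka topic of JSON events.
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "product-events")
      .load()

    // Kafka delivers the payload as binary; cast it to a string column.
    val events = raw.selectExpr("CAST(value AS STRING) AS json", "timestamp")

    // Sink: append to cloud storage, with a checkpoint so the job can recover.
    val query = events.writeStream
      .format("parquet")
      .option("path", "s3://example-bucket/lake/product_events/")
      .option("checkpointLocation", "s3://example-bucket/checkpoints/product_events/")
      .start()

    query.awaitTermination()
  }
}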

Posted 1 week ago

Apply

5.0 years

0 Lacs

Greater Chennai Area

Remote


Your work days are brighter here. At Workday, it all began with a conversation over breakfast. When our founders met at a sunny California diner, they came up with an idea to revolutionize the enterprise software market. And when we began to rise, one thing that really set us apart was our culture, a culture driven by our value of putting our people first. And ever since, the happiness, development, and contribution of every Workmate is central to who we are. Our Workmates believe a healthy, employee-centric, collaborative culture is the essential mix of ingredients for success in business. That’s why we look after our people, communities, and the planet while still being profitable. Feel encouraged to shine, however that manifests: you don’t need to hide who you are. You can feel the energy and the passion; it's what makes us unique. Inspired to make a brighter work day for all and transform with us to the next stage of our growth journey? Bring your brightest version of you and have a brighter work day here.

About the team: Are you interested in an exciting new adventure building developer tooling? The Product Developer Tooling organization develops software and tools to support all of Workday application development and testing, and is extremely passionate about improving developer productivity. As a Software Engineer in our Tooling organization, you will be at the foundation of Workday’s technology, building software that empowers engineering teams to rapidly develop, test, and deliver high-quality products. Our team currently serves the almost 3,000-strong Workday development community by providing scalable development and testing tools that are vital to supporting an efficient continuous delivery platform. Our work environment is not driven by external product launches but by the needs of our own development community, which allows us to focus on producing well-thought-out solutions that enhance our development environment, automated testing, and delivery pipeline.

About the role: We are looking for a passionate, experienced Software Engineer to join us on our mission to help shape the next generation of our Workday Developer Tools! We want someone who will be at the forefront of shaping the development and test lifecycle of the other passionate developers who build our Workday products. Our team follows a hybrid remote model and is built on collaborative teamwork and trust. We love Slack and Zoom to enable our varied communication models, but we also value face-to-face time during the moments that matter to our team.

This role is for you if you are:
• Passionate about technology and building world-class applications and frameworks in a fast-paced, fun, agile work environment
• A proficient OO and/or functional programmer, enthusiastic about learning and applying sound architectural principles to build scalable, performant designs
• Eager to contribute to the scoping, planning, architecture, design, implementation, testing, and delivery of key product features
• Enthusiastic about collaborating with peers, engineering managers, and senior/principal engineers on the technical designs and implementation of new features
• Interested in participating in the release planning process by understanding the details of upcoming features (design, effort, risk, priority, size)
• Interested in product quality, testing, and functional test methodologies (unit testing, TDD, BDD, etc.)

About you - basic qualifications:
• 5+ years of object-oriented and/or functional design and programming (Java, JavaScript, Ruby, Scala, etc.)
• Experience working with automation, CI/CD, or web testing software
• Proficient with HTTP, REST, SOAP, XML, JSON, and key web frameworks (e.g., React, Angular)
• Demonstrated ability to deliver on time, working in a fast-paced agile environment
• Competence in communicating design ideas cohesively using UML or technical presentations
• Agile methodologies, code reviews, Java, JavaScript, Python, software development
• BS/MS in Computer Science or a related technical field

Other qualifications:
• Test focused, with good TDD, unit and system testing, debugging, and profiling skills
• Experienced with common IDE, build, and CI/CD tools (e.g., IntelliJ, Git, Gradle, Maven, Jenkins, TeamCity, Artifactory)
• Good code review skills and the capacity to both provide and act on constructive feedback
• Excellent collaboration and communication skills

Pursuant to applicable Fair Chance law, Workday will consider for employment qualified applicants with arrest and conviction records. Workday is an Equal Opportunity Employer, including individuals with disabilities and protected veterans. Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!
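Since the role emphasizes unit testing and TDD, here is a minimal, illustrative ScalaTest example (assuming ScalaTest 3.x). The Calculator class is a hypothetical stand-in, not Workday code.

// A minimal, illustrative ScalaTest suite.
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical production code under test.
final class Calculator {
  def add(a: Int, b: Int): Int = a + b
}

final class CalculatorSpec extends AnyFunSuite {
  test("add sums two integers") {
    val calc = new Calculator
    assert(calc.add(2, 3) == 5)
  }

  test("add is commutative") {
    val calc = new Calculator
    assert(calc.add(4, 7) == calc.add(7, 4))
  }
}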

Posted 1 week ago

Apply

10.0 - 19.0 years

8 - 9 Lacs

Thiruvananthapuram

On-site

10 - 19 Years
10 Openings
Trivandrum

Role description

Role proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept with ETL tools such as Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.

Outcomes:
• Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reuse of proven solutions.
• Support the Project Manager in day-to-day project execution and account for the developmental activities of others.
• Interpret requirements, create optimal architecture, and design solutions in accordance with specifications.
• Document and communicate milestones/stages for end-to-end delivery.
• Code using best standards; debug and test solutions to ensure best-in-class quality.
• Tune performance of code and align it with the appropriate infrastructure, understanding cost implications of licenses and infrastructure.
• Create data schemas and models effectively.
• Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes.
• Validate results with user representatives, integrating the overall solution.
• Influence and enhance customer satisfaction and employee engagement within project teams.

Measures of outcomes:
• TeamOne's adherence to engineering processes and standards
• TeamOne's adherence to schedule/timelines
• TeamOne's adherence to SLAs where applicable
• TeamOne's number of defects post delivery
• TeamOne's number of non-compliance issues
• TeamOne's reduction of recurrence of known defects
• TeamOne's quick turnaround of production bugs
• Completion of applicable technical/domain certifications
• Completion of all mandatory training requirements
• Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times)
• TeamOne's average time to detect, respond to, and resolve pipeline failures or data issues
• TeamOne's number of data security incidents or compliance breaches

Outputs expected:
• Code: Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for team and peers.
• Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents including design documents, architecture documents, infra costing, business requirements, source-target mappings, test cases, and results.
• Configure: Define and govern the configuration management plan. Ensure compliance from the team.
• Test: Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team.
• Domain relevance: Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications.
• Manage project: Support the Project Manager with project inputs. Provide inputs on project plans or sprints as needed. Manage the delivery of modules.
• Manage defects: Perform defect root cause analysis (RCA) and mitigation. Identify defect trends and implement proactive measures to improve quality.
• Estimate: Create and provide input for effort and size estimation and plan resources for projects.
• Manage knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
• Release: Execute and monitor the release process.
• Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models.
• Interface with customer: Clarify requirements and provide guidance to the development team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs.
• Manage team: Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures.
• Certifications: Obtain relevant domain and technology certifications.

Skill examples:
• Proficiency in SQL, Python, or other programming languages used for data manipulation
• Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF
• Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery)
• Conduct tests on data pipelines and evaluate results against data quality and performance specifications
• Experience in performance tuning
• Experience in data warehouse design and cost improvements
• Apply and optimize data models for efficient storage, retrieval, and processing of large datasets
• Communicate and explain design/development aspects to customers
• Estimate time and resource requirements for developing/debugging features/components
• Participate in RFP responses and solutioning
• Mentor team members and guide them in relevant upskilling and certification

Knowledge examples:
• Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF
• Proficient in SQL for analytics and windowing functions
• Understanding of data schemas and models
• Familiarity with domain-related data
• Knowledge of data warehouse optimization techniques
• Understanding of data security concepts
• Awareness of patterns, frameworks, and automation practices

Skills: Scala, Python, PySpark

About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
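The role description above mentions SQL analytics and windowing functions; here is a minimal, illustrative Spark sketch in Scala of the same idea (keeping only the latest version of each record). The dataset and column names are hypothetical.

// A minimal, illustrative window-function sketch in Spark (Scala).
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

object LatestRecordPerKey {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("latest-record-per-key").getOrCreate()
    import spark.implicits._

    // Hypothetical change feed: several versions of each order.
    val orders = Seq(
      ("o-1", "2024-01-01", 100.0),
      ("o-1", "2024-01-03", 120.0),
      ("o-2", "2024-01-02", 80.0)
    ).toDF("order_id", "updated_at", "amount")

    // Rank each order's versions by recency and keep the newest one.
    val byRecency = Window.partitionBy("order_id").orderBy(col("updated_at").desc)
    val latest = orders
      .withColumn("rn", row_number().over(byRecency))
      .where(col("rn") === 1)
      .drop("rn")

    latest.show()
    spark.stop()
  }
}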

Posted 1 week ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Description: The Data Engineer will own the data infrastructure for the Reverse Logistics team, which includes collaborating with software development teams to build the data infrastructure and maintaining a highly scalable, reliable, and efficient data system to support the fast-growing business. You will work with analytic tools, write excellent SQL scripts, optimize the performance of SQL queries, and partner with internal customers to answer key business questions. We look for candidates who are self-motivated, flexible, hardworking, and who like to have fun.

About the team: The Reverse Logistics team at Amazon Hyderabad Development Center is an agile team whose charter is to deliver the next generation of the Reverse Logistics platform. As a member of this team, your mission will be to design, develop, document, and support a massively scalable, distributed data warehousing, querying, and reporting system.

Basic qualifications:
• 2+ years of data engineering experience
• Experience with data modeling, warehousing, and building ETL pipelines
• Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
• Experience with one or more scripting languages (e.g., Python, KornShell)
• Knowledge of AWS infrastructure
• Knowledge of writing and optimizing SQL queries in a business environment with large-scale, complex datasets
• Strong analytical and problem-solving skills
• Curious, self-motivated, a self-starter with a can-do attitude, and comfortable working in a fast-paced, dynamic environment

Preferred qualifications:
• Bachelor's degree in a quantitative/technical field such as computer science, engineering, or statistics
• Proven track record of strong interpersonal and communication (verbal and written) skills
• Experience developing insights across various areas of customer-related data: financial, product, and marketing
• Proven problem-solving skills, attention to detail, and exceptional organizational skills
• Ability to deal with ambiguity and competing objectives in a fast-paced environment
• Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company: Amazon Dev Center India - Hyderabad
Job ID: A2942481
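As an illustration of the SQL-on-large-datasets work this role centers on, here is a minimal Spark SQL example in Scala. The dataset path and columns are hypothetical, not Amazon's.

// A minimal, illustrative example of running an analytical SQL query with Spark.
import org.apache.spark.sql.SparkSession

object ReturnsByReason {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("returns-by-reason").getOrCreate()

    // Register a hypothetical returns dataset as a temporary view.
    spark.read.parquet("s3://example-bucket/raw/returns/").createOrReplaceTempView("returns")

    // Count returns per reason over the last 30 days.
    val summary = spark.sql(
      """
        |SELECT return_reason, COUNT(*) AS return_count
        |FROM returns
        |WHERE return_date >= date_sub(current_date(), 30)
        |GROUP BY return_reason
        |ORDER BY return_count DESC
      """.stripMargin)

    summary.show()
    spark.stop()
  }
}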

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS - Data and Analytics (D&A) - Senior - Snowflake

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions such as Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity: We’re looking for candidates with strong technology and data understanding in the big data engineering space and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your key responsibilities:
• Lead and architect the migration of a data analytics environment from Teradata to Snowflake with a focus on performance and reliability
• Develop and deploy big data pipelines in a cloud environment using the Snowflake cloud data warehouse
• ETL design, development, and migration of existing on-prem ETL routines to cloud services
• Interact with senior leaders, understand their business goals, and contribute to the delivery of the workstreams
• Design and optimize model code for faster execution

Skills and attributes for success:
• Hands-on developer in the field of data warehousing and ETL
• Hands-on development experience in Snowflake
• Experience in Snowflake modelling: roles, schemas, databases
• Experience in integrating with third-party tools, ETL, and DBT tools
• Experience with Snowflake advanced concepts such as setting up resource monitors and performance tuning is preferable
• Applying object-oriented and functional programming styles to real-world big data engineering problems using Java/Scala/Python
• Develop data pipelines to perform batch and real-time/stream analytics on structured and unstructured data
• Data processing patterns, distributed computing, and building applications for real-time and batch analytics
• Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing

To qualify for the role, you must:
• Be a computer science graduate or equivalent with 3 - 7 years of industry experience
• Have working experience in an Agile-based delivery methodology (preferable)
• Have a flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
• Be an excellent communicator (written and verbal, formal and informal)
• Be a technical expert on all aspects of Snowflake
• Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are properly enabled and able to extend the capabilities of Snowflake on their own
• Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology
• Maintain a deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them
• Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments
• Provide guidance on how to resolve customer-specific technical challenges

Ideally, you’ll also have client management skills.

What we look for: A minimum of 5 years of experience as an architect on analytics solutions and around 2 years of experience with Snowflake. People with technical experience and the enthusiasm to learn new things in this fast-moving environment.

What working at EY offers: At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
• Support, coaching, and feedback from some of the most engaging colleagues around
• Opportunities to develop new skills and progress your career
• The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people, and society and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
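As a rough sketch of loading curated data into a warehouse, here is a minimal Spark-to-JDBC example in Scala. The URL, table, and credentials are hypothetical, and the appropriate JDBC driver is assumed to be on the classpath; a real Teradata-to-Snowflake migration would more likely use Snowflake's dedicated Spark connector, so this only shows the general shape of such a load.

// A minimal, illustrative Spark (Scala) load into a warehouse over plain JDBC.
import org.apache.spark.sql.SparkSession

object WarehouseLoad {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("warehouse-load").getOrCreate()

    // Hypothetical curated dataset produced by an upstream pipeline.
    val curated = spark.read.parquet("s3://example-bucket/curated/orders/")

    curated.write
      .format("jdbc")
      .option("url", "jdbc:snowflake://example.snowflakecomputing.com/?db=ANALYTICS")
      .option("dbtable", "PUBLIC.ORDERS")
      .option("user", sys.env.getOrElse("WAREHOUSE_USER", ""))
      .option("password", sys.env.getOrElse("WAREHOUSE_PASSWORD", ""))
      .mode("append")
      .save()

    spark.stop()
  }
}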

Posted 1 week ago

Apply

0 years

0 Lacs

Delhi

On-site

Job requisition ID: 78129
Date: Jun 4, 2025
Location: Delhi
Designation: Consultant

Your potential, unleashed. India’s impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The team: Deloitte’s Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management, and next-generation analytics and technologies, including big data, cloud, cognitive, and machine learning.

Your work profile: As an Analyst/Consultant/Senior Consultant in our T&T team, you’ll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations:
• Design, develop, and deploy solutions using different tools, design principles, and conventions.
• Configure robotics processes and objects using core workflow principles in an efficient way; ensure they are easily maintainable and easy to understand.
• Understand existing processes and facilitate change requirements as part of a structured change control process.
• Solve day-to-day issues arising while running robotics processes and provide timely resolutions.
• Maintain proper documentation for the solutions, test procedures, and scenarios during the UAT and production phases.
• Coordinate with process owners and the business to understand the as-is process and design the automation process flow.

Desired qualifications:
• Good hands-on experience with GCP services including BigQuery, Cloud Storage, Dataflow, Cloud Dataproc, Cloud Composer/Airflow, and IAM.
• Proficient experience with GCP databases: Bigtable, Spanner, Cloud SQL, and AlloyDB.
• Proficiency in SQL, Python, Java, or Scala for data processing and scripting.
• Experience in development and test automation processes through the CI/CD pipeline (Git, Jenkins, SonarQube, Artifactory, Docker containers).
• Experience in orchestrating data processing tasks using tools like Cloud Composer or Apache Airflow.
• Strong understanding of data modeling, data warehousing, and big data processing concepts.
• Solid understanding and experience of relational database concepts and technologies such as SQL, MySQL, PostgreSQL, or Oracle.
• Design and implement data migration strategies for various database types (PostgreSQL, Oracle, AlloyDB, etc.).
• Deep understanding of at least one database type, with the ability to write complex SQL.
• Experience with NoSQL databases such as MongoDB, Scylla, Cassandra, or DynamoDB is a plus.
• Optimize data pipelines for performance and cost-efficiency, adhering to GCP best practices.
• Implement data quality checks, data validation, and monitoring mechanisms to ensure data accuracy and integrity.
• Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
• Ability to work independently and manage multiple priorities effectively.
• Preferably, expertise in end-to-end DW implementation.

Location and way of working: Base location: Bangalore, Mumbai, Delhi, Pune, Hyderabad. This profile involves occasional travel to client locations. Hybrid is our default way of working; each domain has customized the hybrid approach to its unique needs.

Your role as an Analyst/Consultant/Senior Consultant: We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and society. In addition to living our purpose, Analysts/Consultants/Senior Consultants across our organization must strive to be:
• Inspiring: leading with integrity to build inclusion and motivation.
• Committed to creating purpose: creating a sense of vision and purpose.
• Agile: achieving high-quality results through collaboration and team unity.
• Skilled at building diverse capability: developing diverse capabilities for the future.
• Persuasive/influencing: persuading and influencing stakeholders.
• Collaborating: partnering to build new solutions.
• Delivering value: showing commercial acumen.
• Committed to expanding business: leveraging new business opportunities.
• Analytical acumen: leveraging data to recommend impactful approaches and solutions through the power of analysis and visualization.
• Effective communication: able to hold well-structured and well-articulated conversations to achieve win-win outcomes.
• Engagement management/delivery excellence: effectively managing engagements to ensure timely and proactive execution, as well as course correction for the success of the engagement.
• Managing change: responding to a changing environment with resilience.
• Managing quality & risk: delivering high-quality results and mitigating risks with utmost integrity and precision.
• Strategic thinking & problem solving: applying a strategic mindset to solve business issues and complex problems.
• Tech savvy: leveraging ethical technology practices to deliver high impact for clients and for Deloitte.
• Empathetic leadership and inclusivity: creating a safe and thriving environment where everyone is valued for who they are, using empathy to understand others and adapting our behaviours and attitudes to become more inclusive.

How you’ll grow:
Connect for impact: Our exceptional team of professionals across the globe are solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams, and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude, and potential each and every one of us brings to the table to make an impact that matters.
Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up-/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone’s welcome… entrust your happiness to us: Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you.

Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident, and comfortable. To help you with your interview, we suggest that you do your research and know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Delhi

On-site

Company description: NielsenIQ is a global measurement and data analytics company that provides the most complete and trusted view available of consumers and markets worldwide. We provide consumer packaged goods manufacturers/fast-moving consumer goods companies and retailers with accurate, actionable information and insights and a complete picture of the complex and changing marketplace that companies need to innovate and grow. Our approach marries proprietary NielsenIQ data with other data sources to help clients around the world understand what’s happening now, what’s happening next, and how to best act on this knowledge. We like to be in the middle of the action. That’s why you can find us at work in over 90 countries, covering more than 90% of the world’s population. For more information, visit www.niq.com. NielsenIQ is committed to hiring and retaining a diverse workforce. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class.

About the job: Help our clients, internal and external, understand and use RMS services better by understanding their requirements and queries, and helping address them through knowledge of data science and RMS.

Responsibilities:
• Building knowledge of the Nielsen suite of products and demonstrating the same
• Understanding client concerns
• Able to put forth ways and means of solving client concerns with supervision
• Automation and development of solutions for existing processes
• Taking the initiative to understand concerns/problems in the RMS product and participating in product improvement initiatives

Qualifications:
• Professionals with degrees in Maths, Data Science, Statistics, or related fields involving statistical analysis of large data sets
• 2-3 years of experience in market research or a relevant field

Mindset and approach to work:
• Embraces change, innovation, and iterative processes in order to continuously improve the product's value to clients; continuously collaborates and supports efforts to improve the product
• Active interest in arriving at collaboration and consensus on communication plans, deliverables, and deadlines
• Plans and completes assignments independently within an established framework, breaking down complex tasks and making reasonable decisions; work is reviewed for overall technical soundness
• Participates in data experiments and PoCs, setting measurable goals, timelines, and reproducible outcomes
• Applies critical thinking and takes initiative; continuously reviews the latest industry innovations and effectively applies them to their work
• Consistently challenges and analyzes data to ensure accuracy

Functional skills:
• Ability to manipulate, analyze, and interpret large data sources
• Experienced in high-level programming languages (e.g., Python, R, SQL, Scala), as well as with data visualization tools (e.g., Power BI, Spotfire, Tableau, MicroStrategy)
• Able to work in a virtual environment; familiar with Git/Bitbucket processes
• People with at least some experience in RMS or NIQ will have an advantage
• Can use a logical reasoning process to break down and work through increasingly challenging situations or problems to arrive at positive outcomes
• Identify and use data from various sources to influence decisions, and interpret the data effectively in relation to business objectives

Soft skills:
• Ability to engage and communicate with the team and extended team members
• Can adapt to change and new ideas or ways of working
• Exhibits emotional intelligence when partnering with internal and external stakeholders

Additional information - our benefits:
• Flexible working environment
• Volunteer time off
• LinkedIn Learning
• Employee Assistance Program (EAP)

About NIQ: NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights, delivered with advanced analytics through state-of-the-art platforms, NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com. Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion: NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion

Posted 1 week ago

Apply

3.0 years

35 Lacs

Kolkata, West Bengal, India

Remote


Experience: 3.00+ years
Salary: INR 3500000.00 per year (based on experience)
Expected notice period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity type: Remote
Placement type: Full-time permanent position (payroll and compliance to be managed by: NA)
(Note: This is a requirement for one of Uplers' clients, Nomupay.)

What do you need for this opportunity? Must-have skills: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL

Nomupay is looking for:
📈 Opportunity in a company with a solid track record of performance
🤝 Opportunity to work with diverse, global teams
🚀 Rapid career advancement with opportunities to learn
💰 Competitive salary and performance bonus

Responsibilities:
• Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently.
• Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions.
• Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness.
• Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes.
• Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets.
• Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance.
• Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes.

Requirements:
• Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala.
• Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, Apache Flink, or similar tools for distributed data processing and real-time streaming.
• Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure.
• Strong understanding of data warehousing concepts and data modeling principles.
• Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks.
• Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage.
• Expertise in Git for version control and collaborative coding.
• Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization.

About NomuPay: NomuPay is a newly established company that, through its subsidiaries, provides state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth countries in Asia, Turkey, and the Middle East region. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. Nomu Pay acquired Wirecard Turkey on Apr 21, 2021 for an undisclosed amount.

Founders: Peter Burridge, CEO. An investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector. As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. Peter is a recognizable figure in the San Francisco fintech community and the global payments industry. Peter has previously served in leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and as an investor and advisor in the technology sector. Outside the office, Peter’s passions include racing cars, golf, and rugby union.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
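For context on the event-streaming skills listed above, here is a minimal, illustrative Kafka producer in Scala using the standard Kafka client. The broker, topic, and payment payload are hypothetical, not NomuPay systems.

// A minimal, illustrative Kafka producer in Scala.
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object PaymentEventProducer {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092")
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

    val producer = new KafkaProducer[String, String](props)
    try {
      // Hypothetical payment event, serialized as a JSON string for simplicity.
      val record = new ProducerRecord[String, String](
        "payment-events", "txn-123", """{"amount": 49.99, "currency": "EUR"}""")
      producer.send(record).get() // block for the broker's acknowledgement in this tiny example
    } finally {
      producer.close()
    }
  }
}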

Posted 1 week ago

Apply

3.0 years

35 Lacs

Cuttack, Odisha, India

Remote


Experience: 3.00+ years
Salary: INR 3500000.00 per year (based on experience)
Expected notice period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity type: Remote
Placement type: Full-time permanent position (payroll and compliance to be managed by: NA)
(Note: This is a requirement for one of Uplers' clients, Nomupay.)

What do you need for this opportunity? Must-have skills: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL

Nomupay is looking for:
📈 Opportunity in a company with a solid track record of performance
🤝 Opportunity to work with diverse, global teams
🚀 Rapid career advancement with opportunities to learn
💰 Competitive salary and performance bonus

Responsibilities:
• Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently.
• Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions.
• Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness.
• Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes.
• Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets.
• Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance.
• Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes.

Requirements:
• Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala.
• Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, Apache Flink, or similar tools for distributed data processing and real-time streaming.
• Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure.
• Strong understanding of data warehousing concepts and data modeling principles.
• Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks.
• Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage.
• Expertise in Git for version control and collaborative coding.
• Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization.

About NomuPay: NomuPay is a newly established company that, through its subsidiaries, provides state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth countries in Asia, Turkey, and the Middle East region. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. Nomu Pay acquired Wirecard Turkey on Apr 21, 2021 for an undisclosed amount.

Founders: Peter Burridge, CEO. An investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector. As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. Peter is a recognizable figure in the San Francisco fintech community and the global payments industry. Peter has previously served in leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and as an investor and advisor in the technology sector. Outside the office, Peter’s passions include racing cars, golf, and rugby union.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 week ago

Apply


6.0 - 12.0 years

8 - 10 Lacs

Chennai

On-site

Job Description:

About Us
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities, and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!

Global Business Services
Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.

Process Overview:
The Analytics and Intelligence Engine (AIE) team transforms analytical and operational data into Consumer and Wealth Client insights and enables personalization opportunities that are provided to Associate- and Customer-facing operational applications. The big data technologies used are Hadoop, PySpark/Scala and HQL for ETL, Unix as the file-landing environment, and real-time (or near-real-time) streaming applications.

Job Description:
We are actively seeking a talented and motivated Senior Hadoop Developer/Lead to join our dynamic and energetic team. As a key contributor to our agile scrum teams, you will collaborate closely with the Insights division. We are looking for a candidate who can showcase strong technical expertise in Hadoop and related technologies, and who excels at collaborating with both onshore and offshore team members. The role requires both hands-on coding and collaboration with stakeholders to drive strategic design decisions. While functioning as an individual contributor for one or more teams, the Senior Hadoop Data Engineer may also have the opportunity to lead and take responsibility for end-to-end solution design and delivery, based on the scale of implementation and required skillsets.

Responsibilities:
Develop high-performance and scalable solutions for Insights, using the Big Data platform to facilitate the collection, storage, and analysis of massive data sets from multiple channels.
Utilize in-depth knowledge of the Hadoop stack and storage technologies, including HDFS, Spark, Scala, MapReduce, Yarn, Hive, Sqoop, Impala, Hue, and Oozie, to design and optimize data processing workflows.
Implement near-real-time and streaming data solutions using Spark Streaming and Kafka to provide up-to-date information to millions of Bank customers. (A minimal streaming sketch follows this listing.)
Collaborate with cross-functional teams to identify system bottlenecks, benchmark performance, and propose innovative solutions to enhance system efficiency.
Take ownership of defining Big Data strategies and roadmaps for the Enterprise, aligning them with business objectives.
Apply expertise in NoSQL technologies like MongoDB, SingleStore, or HBase to efficiently handle diverse data types and storage requirements.
Stay abreast of emerging technologies and industry trends related to Big Data, continuously evaluating new tools and frameworks for potential integration.
Provide guidance and mentorship to junior teammates.

Requirements:
Education: Graduation / Post Graduation: BE/B.Tech/MCA. Certifications, if any: NA.
Experience Range: 6 to 12 years.

Foundational Skills:
Minimum of 7 years of industry experience, with at least 5 years focused on hands-on work in the Big Data domain.
Highly skilled in Hadoop stack technologies such as HDFS, Spark, Hive, Yarn, Sqoop, Impala and Hue.
Strong proficiency in programming languages such as Python, Scala, and Bash/shell scripting.
Excellent problem-solving abilities and the capability to deliver effective solutions for business-critical applications.
Strong command of visual analytics tools, with a focus on Tableau.

Desired Skills:
Experience in real-time streaming technologies like Spark Streaming, Kafka, Flink, or Storm.
Proficiency in NoSQL technologies like HBase, MongoDB, SingleStore, etc.
Familiarity with cloud technologies such as Azure, AWS, or GCP.
Working knowledge of machine learning algorithms, statistical analysis, and programming languages (Python or R) to conduct data analysis and develop predictive models that uncover valuable patterns and trends.
Proficiency in data integration and data security within the Hadoop ecosystem, including knowledge of Kerberos.

Work Timings: 12:00 PM to 09:00 PM IST.
Job Location: Chennai, Mumbai
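As a hedged illustration of the near-real-time responsibility above, the sketch below shows a Spark Structured Streaming job in Scala that reads a Kafka topic and lands micro-batches on HDFS. The broker address, topic name, and HDFS paths are placeholders, not details from the role.

```scala
import org.apache.spark.sql.SparkSession

object NearRealTimeIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("near-real-time-ingest")
      .getOrCreate()

    // Read a Kafka topic as a streaming DataFrame (broker and topic are hypothetical).
    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")
      .option("subscribe", "customer-events")
      .load()

    // Kafka delivers key/value as binary; cast the value to string before parsing.
    val events = stream.selectExpr("CAST(value AS STRING) AS json")

    // Land micro-batches as Parquet, with checkpointing for fault tolerance.
    val query = events.writeStream
      .format("parquet")
      .option("path", "hdfs:///data/streams/customer_events/")
      .option("checkpointLocation", "hdfs:///checkpoints/customer_events/")
      .start()

    query.awaitTermination()
  }
}
```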

Posted 1 week ago

Apply

2.0 - 4.0 years

2 - 3 Lacs

Chennai

On-site

The Data Science Analyst 2 is a developing professional role. Applies specialty area knowledge in monitoring, assessing, analyzing and/or evaluating processes and data. Identifies policy gaps and formulates policies. Interprets data and makes recommendations. Researches and interprets factual information. Identifies inconsistencies in data or results, defines business issues and formulates recommendations on policies, procedures or practices. Integrates established disciplinary knowledge within own specialty area with a basic understanding of related industry practices. Good understanding of how the team interacts with others in accomplishing the objectives of the area. Develops working knowledge of industry practices and standards. Limited but direct impact on the business through the quality of the tasks/services provided. Impact of the job holder is restricted to own team.

Responsibilities:
The Data Engineer is responsible for building data engineering solutions using next-generation data techniques. The individual will work with tech leads, product owners, customers and technologists to deliver data products/solutions in a collaborative and agile environment.
Responsible for design and development of big data solutions.
Partner with domain experts, product managers, analysts, and data scientists to develop Big Data pipelines in Hadoop.
Responsible for moving all legacy workloads to the cloud platform.
Work with data scientists to build Client pipelines using heterogeneous sources and provide engineering services for data science applications.
Ensure automation through CI/CD across platforms, both in the cloud and on-premises.
Define needs around maintainability, testability, performance, security, quality and usability for the data platform.
Drive implementation, consistent patterns, reusable components, and coding standards for data engineering processes.
Convert SAS-based pipelines into languages like PySpark or Scala to execute on Hadoop, Snowflake and non-Hadoop ecosystems. (A minimal conversion sketch follows this listing.)
Tune big data applications on Hadoop, cloud and non-Hadoop platforms for optimal performance.
Applies an in-depth understanding of how data analytics collectively integrate within the sub-function, and coordinates and contributes to the objectives of the entire function.
Produces detailed analysis of issues where the best course of action is not evident from the information available, but actions must be recommended/taken.
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Qualifications:
2-4 years of total IT experience.
Experience with Hadoop (Cloudera)/big data technologies/cloud/AI tools.
Hands-on experience with HDFS, MapReduce, Hive, Impala, Spark, Kafka, Kudu, Kubernetes, dashboard tools, Snowflake, AWS tools, AI/ML libraries and tools, etc.
Experience designing and developing data pipelines for data ingestion or transformation.
System-level understanding: data structures, algorithms, distributed storage and compute tools, SQL expertise, shell scripting, scheduling tools, Scrum/Agile methodologies.
Can-do attitude toward solving complex business problems, good interpersonal and teamwork skills.

Education: Bachelor's/University degree or equivalent experience.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Job Family Group: Technology
Job Family: Data Science
Time Type: Full time

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
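A minimal sketch, assuming a Hive-backed source table, of the kind of SAS-to-Spark conversion mentioned above: a summary aggregation rewritten as a Spark job in Scala. The database, table, and column names (analytics.transactions, account_id, amount) are invented for the example.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object AccountSummary {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("account-summary")
      .enableHiveSupport()
      .getOrCreate()

    // Read a Hive table (table and column names are illustrative).
    val txns = spark.table("analytics.transactions")

    // Roughly the equivalent of a SAS PROC SUMMARY by account and month.
    val summary = txns
      .withColumn("month", date_format(col("txn_date"), "yyyy-MM"))
      .groupBy("account_id", "month")
      .agg(
        count(lit(1)).as("txn_count"),
        sum("amount").as("total_amount"),
        avg("amount").as("avg_amount")
      )

    // Persist the result back to the warehouse for downstream consumers.
    summary.write.mode("overwrite").saveAsTable("analytics.account_monthly_summary")
    spark.stop()
  }
}
```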

Posted 1 week ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Hyderabad

Work from Office


One of the most valuable assets in today's financial industry is data, which can provide businesses the intelligence essential to making business and financial decisions with conviction. This role gives you the opportunity to work on Ratings and Research related data. You will work on cutting-edge big data technologies and will be responsible for development of both data feeds and API work.

Location: Hyderabad

The Team: RatingsXpress is at the heart of financial workflows when it comes to providing and analyzing data. We provide Ratings and Research information to clients. Our work deals with content ingestion, data feed generation, and exposing data to clients via API calls. This position is part of the RatingsXpress team and is focused on providing clients the critical data they need to make the most informed investment decisions possible.

Impact: As a member of the Xpressfeed Team in S&P Global Market Intelligence, you will work with a group of intelligent and visionary engineers to build impactful content management tools for investment professionals across the globe. Our software engineers are involved in the full product life cycle, from design through release. You will be expected to participate in application design, write high-quality code and innovate on how to improve overall system performance and customer experience. If you are a talented developer who wants to help drive the next phase for Data Management Solutions at S&P Global, can contribute great ideas, solutions and code, and understand the value of cloud solutions, we would like to talk to you.

What's in it for you: We are currently seeking a Software Developer with a passion for full-stack development. In this role, you will have the opportunity to work on cutting-edge cloud technologies such as Databricks, Snowflake, and AWS, while also engaging in Scala and SQL Server based database development. This position offers a unique opportunity to grow both as a Full Stack Developer and as a Cloud Engineer, expanding your expertise across modern data platforms and backend development.

Responsibilities:
Analyze, design and develop solutions within a multi-functional Agile team to support key business needs for the data feeds.
Design, implement and test solutions using AWS EMR for content ingestion. (A minimal Spark extract sketch follows this listing.)
Work on complex SQL Server projects involving high-volume data.
Engineer components and common services based on standard corporate development models, languages and tools.
Apply software engineering best practices while also leveraging automation across all elements of solution delivery.
Collaborate effectively with technical and non-technical stakeholders.
Document and demonstrate technical solutions by developing documentation, diagrams, code comments, etc.

Basic Qualifications:
Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
3-6 years of experience in application development.
Minimum of 2 years of hands-on experience with Scala.
Minimum of 2 years of hands-on experience with Microsoft SQL Server.
Solid understanding of Amazon Web Services (AWS) and cloud-based development.
In-depth knowledge of system architecture, object-oriented programming, and design patterns.
Excellent communication skills, with the ability to convey complex ideas clearly both verbally and in writing.

Preferred Qualifications:
Familiarity with AWS services: EMR, Auto Scaling, EKS.
Working knowledge of Snowflake.
Preferred experience in Python development.
Familiarity with the Financial Services domain and Capital Markets is a plus.
Experience developing systems that handle large volumes of data and require high computational performance.
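To make the EMR/SQL Server combination above concrete, here is a minimal Scala sketch of a Spark job that extracts a SQL Server table over JDBC and stages it to S3 as Parquet. The host, database, table, bucket, and environment-variable names are assumptions, and the Microsoft SQL Server JDBC driver would need to be on the cluster classpath.

```scala
import org.apache.spark.sql.SparkSession

object RatingsFeedExtract {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ratings-feed-extract")
      .getOrCreate()

    // Pull a source table from SQL Server over JDBC (connection details are placeholders).
    val ratings = spark.read
      .format("jdbc")
      .option("url", "jdbc:sqlserver://db-host:1433;databaseName=ratings")
      .option("dbtable", "dbo.issuer_ratings")
      .option("user", sys.env("DB_USER"))
      .option("password", sys.env("DB_PASSWORD"))
      .load()

    // Stage the extract to S3 as Parquet for downstream feed generation on EMR.
    ratings.write
      .mode("overwrite")
      .parquet("s3://feeds-staging/issuer_ratings/")

    spark.stop()
  }
}
```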

Posted 1 week ago

Apply

2.0 years

0 Lacs

Delhi, India

On-site


Company Description
NielsenIQ is a global measurement and data analytics company that provides the most complete and trusted view available of consumers and markets worldwide. We provide consumer packaged goods manufacturers/fast-moving consumer goods and retailers with accurate, actionable information and insights and a complete picture of the complex and changing marketplace that companies need to innovate and grow. Our approach marries proprietary NielsenIQ data with other data sources to help clients around the world understand what's happening now, what's happening next, and how to best act on this knowledge. We like to be in the middle of the action. That's why you can find us at work in over 90 countries, covering more than 90% of the world's population. For more information, visit www.niq.com. NielsenIQ is committed to hiring and retaining a diverse workforce. We are proud to be an Equal Opportunity/Affirmative Action-Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class.

Job Description
About the job: Help our clients, internal and external, understand and use RMS services better by understanding their requirements and queries, and helping address them through knowledge of data science and RMS.

Responsibilities
Building knowledge of the Nielsen suite of products and demonstrating the same
Understanding client concerns
Able to put forth ways and means of solving client concerns with supervision
Automation and development of solutions for existing processes
Taking initiative to understand concerns/problems in the RMS product and participating in product improvement initiatives

Qualifications
Professionals with degrees in Maths, Data Science, Statistics, or related fields involving statistical analysis of large data sets
2-3 years of experience in market research or a relevant field

Mindset and Approach to work:
Embraces change, innovation and iterative processes in order to continuously improve the product's value to clients
Continuously collaborates and supports to improve the product
Active interest in arriving at collaboration and consensus in communication plans, deliverables and deadlines
Plans and completes assignments independently within an established framework, breaking down complex tasks and making reasonable decisions. Work is reviewed for overall technical soundness.
Participates in data experiments and PoCs, setting measurable goals, timelines and reproducible outcomes
Applies critical thinking and takes initiative
Continuously reviews the latest industry innovations and effectively applies them to their work
Consistently challenges and analyzes data to ensure accuracy

Functional Skills:
Ability to manipulate, analyze and interpret large data sources
Experienced in high-level programming languages (e.g. Python, R, SQL, Scala), as well as with data visualization tools (e.g. Power BI, Spotfire, Tableau, MicroStrategy)
Able to work in a virtual environment; familiar with git/Bitbucket processes
People with at least some experience in RMS or NIQ will have an advantage
Can use a logical reasoning process to break down and work through increasingly challenging situations or problems to arrive at positive outcomes
Identify and use data from various sources to influence decisions
Interpret the data effectively in relation to business objectives

Soft Skills
Ability to engage/communicate with team and extended team members
Can adapt to change and new ideas or ways of working
Exhibits emotional intelligence when partnering with internal and external stakeholders

Additional Information
Our Benefits
Flexible working environment
Volunteer time off
LinkedIn Learning
Employee Assistance Program (EAP)

About NIQ
NIQ is the world's leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world's population. For more information, visit NIQ.com.

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action-Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion

Posted 1 week ago

Apply

0 years

6 - 9 Lacs

Calcutta

On-site

Ready to shape the future of work?

At Genpact, we don't just adapt to change—we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Consultant, AI/ML Lead!

In this role, we are looking for candidates who have relevant years of experience in text mining. The Text Mining Scientist (TMS) is expected to play a pivotal bridging role between enterprise database teams and business/functional resources. At a broad level, the TMS will leverage his/her solutioning expertise to translate the customer's business need into a techno-analytic problem and work with database teams to bring large-scale text analytic solutions to fruition. The right candidate should have prior experience in developing text mining and NLP solutions using open-source tools.

Responsibilities
Develop transformative AI/ML solutions to address our clients' business requirements and challenges
Project delivery: successful delivery of projects involving data pre-processing, model training and evaluation, and parameter tuning
Manage stakeholder/customer expectations
Project blueprinting and project documentation; creating the project plan
Understand and research cutting-edge industrial and academic developments in AI/ML with NLP/NLU applications in diverse industries such as CPG, Finance, etc.
Conceptualize, design, build and develop solution algorithms that demonstrate the minimum required functionality within tight timelines
Interact with clients to collect, synthesize, and propose requirements and create an effective analytics/text mining roadmap
Work with digital development teams to integrate and transform these algorithms into production-quality applications
Do applied research on a wide array of text analytics and machine learning projects, file patents and publish papers
(A minimal text-featurization sketch follows this listing.)

Qualifications we seek in you!

Minimum Qualifications / Skills
MS in Computer Science, Information Systems, Computer Engineering, or Systems Engineering with relevant experience in Text Mining / Natural Language Processing (NLP) tools, data science, big data and algorithms.
Post-graduation in MBA and an undergraduate degree in any engineering discipline, preferably Computer Science, with relevant experience
Full-cycle experience desirable in at least one large-scale Text Mining/NLP project, from creating a business use case, text analytics assessment/roadmap, technology and analytic solutioning, through implementation and change management; considerable experience in Hadoop, including development in the map-reduce framework

Technology
Open-source text mining paradigms such as NLTK, OpenNLP, OpenCalais, StanfordNLP, GATE, UIMA, Lucene, and cloud-based NLU tools such as DialogFlow, MS LUIS
Exposure to statistical toolkits such as R, Weka, S-Plus, Matlab, SAS Text Miner
Strong core Java experience in large-scale product development and functional knowledge of RDBMSs
Hands-on programming in the Hadoop ecosystem, and concepts in distributed computing
Very good Python/R programming skills; Java programming skills a plus

Methodology
Relevant years of solutioning and consulting experience in verticals such as BFSI and CPG, with hands-on delivery of text analytics on large structured and unstructured data
A solid foundation in AI methodologies like ML, DL, NLP, Neural Networks, Information Retrieval and Extraction, NLG, NLU
Exposure to concepts in Natural Language Processing and statistics, especially in their application to Sentiment Analysis, Contextual NLP, Dependency Parsing, Parsing, Chunking, Summarization, etc.
Demonstrated ability to conduct look-ahead client research with a focus on supplementing and strengthening the client's analytics agenda with newer tools and techniques

Preferred Qualifications / Skills
Technology
Expert-level understanding of NLP, NLU and machine learning/deep learning methods
OpenNLP, OpenCalais, StanfordNLP, GATE, UIMA, Lucene, NoSQL
UI development paradigms that enable visualization of text mining insights, e.g., Adobe Flex Builder, HTML5, CSS3
Linux, Windows, GPU experience
Spark, Scala for distributed computing
Deep learning frameworks such as TensorFlow, Keras, Torch, Theano

Methodology
Social network modeling paradigms, tools and techniques
Text analytics using Natural Language Processing tools such as Support Vector Machines and Social Network Analysis
Previous experience with text analytics implementations using open-source packages and/or SAS Text Miner
Ability to prioritize, a consultative mindset, and time-management skills

Why join Genpact?
Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
Make an impact – Drive change for global enterprises and solve business challenges that matter
Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Consultant
Primary Location: India-Kolkata
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 4, 2025, 6:34:09 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
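A minimal, generic sketch of text featurization at scale using Spark ML in Scala, offered as one possible open-source building block for the kind of text mining work described above rather than as Genpact's actual methodology. The sample documents and column names are invented for the example.

```scala
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.feature.{HashingTF, IDF, RegexTokenizer, StopWordsRemover}
import org.apache.spark.sql.SparkSession

object TextFeaturePipeline {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("text-feature-pipeline")
      .getOrCreate()

    // Illustrative corpus; a real project would read documents from a data lake.
    val docs = spark.createDataFrame(Seq(
      (0L, "Consumer sentiment improved in the quarter"),
      (1L, "Pricing pressure weighed on packaged goods margins")
    )).toDF("id", "text")

    // Tokenize, drop stop words, and build TF-IDF features for downstream models.
    val tokenizer = new RegexTokenizer().setInputCol("text").setOutputCol("tokens").setPattern("\\W+")
    val remover   = new StopWordsRemover().setInputCol("tokens").setOutputCol("filtered")
    val tf        = new HashingTF().setInputCol("filtered").setOutputCol("rawFeatures").setNumFeatures(1 << 18)
    val idf       = new IDF().setInputCol("rawFeatures").setOutputCol("features")

    val model    = new Pipeline().setStages(Array(tokenizer, remover, tf, idf)).fit(docs)
    val features = model.transform(docs)
    features.select("id", "features").show(truncate = false)

    spark.stop()
  }
}
```

The resulting feature vectors could feed a classifier such as an SVM or a deep learning model, which is where the methodology choices listed in the posting would come in.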

Posted 1 week ago

Apply

5.0 years

0 Lacs

Andhra Pradesh

On-site

Responsibilities Design and Develop Scalable Data Pipelines: Build and maintain robust data pipelines using Python to process, transform, and integrate large-scale data from diverse sources. Orchestration and Automation: Implement and manage workflows using orchestration tools such as Apache Airflow to ensure reliable and efficient data operations. Data Warehouse Management: Work extensively with Snowflake to design and optimize data models, schemas, and queries for analytics and reporting. Queueing Systems: Leverage message queues like Kafka, SQS, or similar tools to enable real-time or batch data processing in distributed environments. Collaboration: Partner with Data Science, Product, and Engineering teams to understand data requirements and deliver solutions that align with business objectives. Performance Optimization: Optimize the performance of data pipelines and queries to handle large scales of data efficiently. Data Governance and Security: Ensure compliance with data governance and security standards to maintain data integrity and privacy. Documentation: Create and maintain clear, detailed documentation for data solutions, pipelines, and workflows. Qualifications Required Skills: 5+ years of experience in data engineering roles with a focus on building scalable data solutions. Proficiency in Python for ETL, data manipulation, and scripting. Hands-on experience with Snowflake or equivalent cloud-based data warehouses. Strong knowledge of orchestration tools such as Apache Airflow or similar. Expertise in implementing and managing messaging queues like Kafka, AWS SQS, or similar. Demonstrated ability to build and optimize data pipelines at scale, processing terabytes of data. Experience in data modeling, data warehousing, and database design. Proficiency in working with cloud platforms like AWS, Azure, or GCP. Strong understanding of CI/CD pipelines for data engineering workflows. Experience working in an Agile development environment, collaborating with cross-functional teams. Preferred Skills: Familiarity with other programming languages like Scala or Java for data engineering tasks. Knowledge of containerization and orchestration technologies (Docker, Kubernetes). Experience with stream processing frameworks like Apache Flink. Experience with Apache Iceberg for data lake optimization and management. Exposure to machine learning workflows and integration with data pipelines. Soft Skills: Strong problem-solving skills with a passion for solving complex data challenges. Excellent communication and collaboration skills to work with cross-functional teams. Ability to thrive in a fast-paced, innovative environment. About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. 
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
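The listing above is Python-first, but since this page focuses on Scala roles, here is a minimal sketch of the kind of streaming pipeline it describes, written against Spark Structured Streaming's Scala API. The broker address, topic name, and output paths are hypothetical (not taken from the listing), and the same logic translates directly to PySpark.

```scala
// Illustrative sketch only: consume a hypothetical Kafka topic ("user-events")
// and land the raw records as Parquet. Broker, topic, and paths are assumptions.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object EventIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("event-ingest")
      .getOrCreate()

    // Read the stream from Kafka; key/value arrive as binary and are cast to strings.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092") // assumed broker
      .option("subscribe", "user-events")                  // assumed topic
      .load()
      .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))

    // Write micro-batches to Parquet; the checkpoint lets the query resume after restarts.
    val query = events.writeStream
      .format("parquet")
      .option("path", "/data/raw/user_events")             // assumed landing path
      .option("checkpointLocation", "/data/checkpoints/user_events")
      .start()

    query.awaitTermination()
  }
}
```

Submitted with spark-submit (or run as an EMR/Databricks job), a job like this ingests events continuously; swapping the source for SQS or Kinesis changes only the read side of the pipeline.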

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Department: Information Technology
Location: APAC-India-IT Delivery Center Hyderabad

Description

Essential Duties and Responsibilities:
Develop and maintain data pipelines using Azure native services like ADLS Gen 2, Azure Data Factory, Synapse, Spark, Python, Databricks and AWS Cloud services, Databurst (a minimal pipeline sketch follows at the end of this listing).
Develop datasets required for Business Analytics in Power BI and Azure Data Warehouse.
Ensure software development principles, standards, and best practices are followed.
Maintain existing applications and provide operational support.
Review and analyze user requirements and write system specifications.
Ensure quality design, delivery, and adherence to corporate standards.
Participate in daily stand-ups, reviews, design sessions, and architectural discussions.
Other duties may be assigned.

What We're Looking For

Required Qualifications and Skills:
5+ years of experience in solution delivery for Data Analytics to generate insights for various departments in the organization.
5+ years of experience delivering solutions using the Microsoft Azure Platform or AWS services, with emphasis on data solutions and services.
Extensive knowledge of writing SQL queries and experience in performance-tuning queries.
Experience developing software architectures and key software components.
Proficient in one or more of the following programming languages: C#, Java, Python, Scala, and related open-source frameworks.
Understanding of data services including Azure SQL Database, Data Lake, Databricks, Data Factory, Synapse.
Data modeling experience on Azure DW / AWS; understanding of dimensional models, star schemas, and data vaults.
Quick learner who is passionate about new technologies.
Strong sense of ownership, customer obsession, and drive with a can-do attitude.
Team player with great communication skills (listening, speaking, reading, and writing) in English.
BS in Computer Science, Computer Engineering, or other quantitative fields such as Statistics, Mathematics, Physics, or Engineering.

Applicant Privacy Policy
Review our Applicant Privacy Policy for additional information.

Equal Opportunity Statement
Align Technology is an equal opportunity employer. We are committed to providing equal employment opportunities in all our practices, without regard to race, color, religion, sex, national origin, ancestry, marital status, protected veteran status, age, disability, sexual orientation, gender identity or expression, or any other legally protected category. Applicants must be legally authorized to work in the country for which they are applying, and employment eligibility will be verified as a condition of hire.
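As a rough illustration of the Azure-side pipeline work this role describes, here is a minimal sketch assuming Spark on Databricks with the Scala API. The storage account, container, column names, and target table are hypothetical and not taken from the listing.

```scala
// Illustrative sketch only: read raw CSV from ADLS Gen 2 and publish a curated
// Delta table for downstream analytics. Lake path and table names are assumptions.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, to_date}

object CuratedOrders {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("curated-orders").getOrCreate()

    // Raw zone in the lake (abfss URI is an assumption).
    val raw = spark.read
      .option("header", "true")
      .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/")

    // Light typing and cleanup before exposing the data to Power BI / the warehouse.
    val curated = raw
      .withColumn("order_date", to_date(col("order_date")))
      .filter(col("order_id").isNotNull)

    // Assumes a pre-existing "analytics" database in the metastore.
    curated.write
      .format("delta")
      .mode("overwrite")
      .saveAsTable("analytics.curated_orders")
  }
}
```

In practice a step like this would typically be triggered from Azure Data Factory or a Databricks job, with the curated table then consumed by Power BI or Synapse.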

Posted 1 week ago

Apply

6.0 - 10.0 years

10 - 17 Lacs

Bengaluru

Hybrid


Looking for a Big Data Engineer with experience in Spark, Scala, and Java.
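As a quick illustration of the Spark-and-Scala work implied by this listing, a minimal batch-aggregation sketch; the input and output paths and column names are assumptions.

```scala
// Illustrative sketch only: read Parquet, aggregate, and write results.
// Paths and columns ("page_id", "click_date") are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.count

object ClickSummary {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("click-summary").getOrCreate()

    val clicks = spark.read.parquet("/data/clicks") // assumed input path

    // Daily click counts per page.
    val summary = clicks
      .groupBy("page_id", "click_date")
      .agg(count("*").as("clicks"))

    summary.write.mode("overwrite").parquet("/data/click_summary") // assumed output path
    spark.stop()
  }
}
```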

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies