
244 NiFi Jobs - Page 8

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


About Us
All people need connectivity. The Rakuten Group is reinventing telecom by greatly reducing cost, rewarding big users rather than penalizing them, empowering more people, and leading the human-centric AI future. The mission is to connect everybody and enable all to be. Rakuten. Telecom Invented.

Job Description
Job Title: Data Engineer
Location: Bangalore (Onsite)

Why should you choose us? Rakuten Symphony is a Rakuten Group company that provides global B2B services for the mobile telco industry and enables next-generation, cloud-based, international mobile services. Building on the technology Rakuten used to launch Japan's newest mobile network, we are taking our mobile offering global. To support our ambitions to provide an innovative cloud-native telco platform for our customers, Rakuten Symphony is looking to recruit and develop top talent from around the globe. We are looking for individuals to join our team across all functional areas of our business, from sales to engineering, support functions to product development. Let's build the future of mobile telecommunications together!

What will you do? Our Data Platform team is building a world-class autonomous data service platform offering services such as data lake as a service, database as a service, data transformation as a service, and AI services. We are looking for a Data Engineer to help us build functional systems for our data platform services. As a Data Engineer on our autonomous data platform services, you will be responsible for the end-to-end research and development of relevant platform features, following the product lifecycle of the offerings. If you have in-depth knowledge of Spark, NiFi, and other distributed systems, we'd like to meet you. Ultimately, you will develop various self-managed, scalable data platform services offered to end users as cloud services.

Roles and Responsibilities
- Work experience as a Data Engineer or in a similar software engineering role (3-8 years)
- Good knowledge of NiFi, Spark, and distributed ecosystems; knowledge of Kubernetes application development and how to build Kubernetes-centric applications is a plus
- Expertise in development using Java/Scala (not Python)
- Ability to quickly design and implement fault-tolerant, highly available pipelines on distributed ecosystems; hands-on experience with a NoSQL DB (e.g. Cassandra), Presto/Trino, NiFi, and Airflow
- Sound knowledge of the Spring Framework and Spring-based solutions for web application development
- Problem-solving attitude, tinkering approach, and creative thinking

Our Commitment To You
Rakuten Group's mission is to contribute to society by creating value through innovation and entrepreneurship. By providing high-quality services that help our users and partners grow, we aim to advance and enrich society. To fulfill our role as a Global Innovation Company, we are committed to maximizing both corporate and shareholder value.
Job Requirement

Responsibilities
- Research and comparative analysis
- System architecture design
- Implement integrations
- Deploy updates and fixes
- Perform root cause analysis for production errors
- Investigate and resolve technical issues
- Architecture and support documentation

Requirements
- Work experience as a Data Engineer or in a similar software engineering role (3-8 years)
- Good knowledge of NiFi, Spark, and distributed ecosystems; knowledge of Kubernetes application development and how to build Kubernetes-centric applications is a plus (a short NiFi sketch follows this posting)
- Expertise in development using Java/Scala (not Python)
- Ability to quickly design and implement fault-tolerant, highly available pipelines on distributed ecosystems; hands-on experience with a NoSQL DB (e.g. Cassandra), Presto/Trino, NiFi, and Airflow
- Sound knowledge of the Spring Framework and Spring-based solutions for web application development
- Problem-solving attitude, tinkering approach, and creative thinking
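Since the role leans on NiFi-based pipelines, here is a minimal sketch of checking flow health through NiFi's REST API. It assumes a local, unsecured NiFi instance (a secured cluster would need a bearer token or client certificate), and is written in Python for brevity even though the posting itself asks for Java/Scala.

```python
# Minimal sketch: poll a NiFi instance's controller-level flow status.
# Assumption: NiFi listening at localhost:8080 with no authentication.
import requests

NIFI_API = "http://localhost:8080/nifi-api"  # hypothetical local instance

def flow_status() -> dict:
    """Fetch controller-level counters (active threads, queued FlowFiles)."""
    resp = requests.get(f"{NIFI_API}/flow/status", timeout=10)
    resp.raise_for_status()
    return resp.json()["controllerStatus"]

if __name__ == "__main__":
    status = flow_status()
    print("active threads:", status["activeThreadCount"])
    print("queued:", status["queued"])  # e.g. "12 / 3.4 MB"
```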

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About The Role
Grade Level (for internal use): 10
Position Title: Senior Software Developer

The Team: Do you love to collaborate and provide solutions? This team comes together across eight different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams.

The Impact: We focus primarily on developing, enhancing, and delivering required pieces of information and functionality to internal and external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact.

What's in it for you?
- Opportunities for innovation and learning new state-of-the-art technologies
- Working in a pure agile and scrum methodology

Responsibilities
- Design and implement software-related projects.
- Perform analyses and articulate solutions.
- Design underlying engineering for use in multiple product offerings supporting a large volume of end-users.
- Develop project plans with task breakdowns and estimates.
- Manage and improve existing solutions.
- Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits.

Basic Qualifications (What We're Looking For)
- Bachelor's degree in Computer Science or equivalent
- 6+ years' related experience
- Passionate, smart, and articulate developer
- Strong C#, WPF, and SQL skills
- Experience implementing web services (with WCF, RESTful JSON, SOAP, TCP), Windows services, unit tests, and dependency injection
- Able to demonstrate strong OOP skills
- Able to work well individually and with a team
- Strong problem-solving skills
- Good work ethic, self-starter, and results-oriented
- Interest and experience in environmental and sustainability content is a plus
- Agile/Scrum experience is a plus
- Exposure to data engineering and big data technologies like Hadoop, Spark/Scala, NiFi, and ETL is a plus

Preferred Qualifications
- Experience with Docker is a plus
- Experience working in cloud computing environments such as AWS, Azure, or GCP
- Experience with large-scale messaging systems such as Kafka, RabbitMQ, or commercial systems

Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People
We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind.
Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
- Health & Wellness: health care coverage designed for the mind and body.
- Flexible Downtime: generous time off helps keep you energized for your time on.
- Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: it's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)
Job ID: 315230
Posted On: 2025-05-05
Location: Gurgaon, Haryana, India

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About The Role
Grade Level (for internal use): 10
Position Title: Senior Software Developer

The Team: Do you love to collaborate and provide solutions? This team comes together across eight different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams.

The Impact: We focus primarily on developing, enhancing, and delivering required pieces of information and functionality to internal and external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact.

What's in it for you?
- Opportunities for innovation and learning new state-of-the-art technologies
- Working in a pure agile and scrum methodology

Responsibilities
- Design and implement software-related projects.
- Perform analyses and articulate solutions.
- Design underlying engineering for use in multiple product offerings supporting a large volume of end-users.
- Develop project plans with task breakdowns and estimates.
- Manage and improve existing solutions.
- Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits.

Basic Qualifications (What We're Looking For)
- Bachelor's degree in Computer Science or equivalent
- 6+ years' related experience
- Passionate, smart, and articulate developer
- Strong C#, WPF, and SQL skills
- Experience implementing web services (with WCF, RESTful JSON, SOAP, TCP), Windows services, unit tests, and dependency injection
- Able to demonstrate strong OOP skills
- Able to work well individually and with a team
- Strong problem-solving skills
- Good work ethic, self-starter, and results-oriented
- Interest and experience in environmental and sustainability content is a plus
- Agile/Scrum experience is a plus
- Exposure to data engineering and big data technologies like Hadoop, Spark/Scala, NiFi, and ETL is a plus

Preferred Qualifications
- Experience with Docker is a plus
- Experience working in cloud computing environments such as AWS, Azure, or GCP
- Experience with large-scale messaging systems such as Kafka, RabbitMQ, or commercial systems

Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People
We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind.
Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
- Health & Wellness: health care coverage designed for the mind and body.
- Flexible Downtime: generous time off helps keep you energized for your time on.
- Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: it's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)
Job ID: 315230
Posted On: 2025-05-05
Location: Gurgaon, Haryana, India

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


What is Blend?
Blend is a premier AI services provider, committed to co-creating meaningful impact for its clients through the power of data science, AI, technology, and people. With a mission to fuel bold visions, Blend tackles significant challenges by seamlessly aligning human expertise with artificial intelligence. The company is dedicated to unlocking value and fostering innovation for its clients by harnessing world-class people and data-driven strategy. We believe that the power of people and AI can have a meaningful impact on your world, creating more fulfilling work and projects for our people and clients. For more information, visit www.blend360.com

What is the Role?
We are seeking a highly skilled Lead Data Engineer to join our data engineering team for an on-premise environment. A large portion of your time will be in the weeds working alongside your team architecting, designing, implementing, and optimizing data solutions. The ideal candidate will have extensive experience in building and optimizing data pipelines, architectures, and data sets, with a strong focus on Python, SQL, Hadoop, HDFS, and Apache NiFi (a short PySpark sketch follows this posting).

What you'll be doing
- Design, develop, and maintain robust, scalable, and high-performance data pipelines and data integration solutions.
- Manage and optimize data storage in the Hadoop Distributed File System (HDFS).
- Design and implement data workflows using Apache NiFi for data ingestion, transformation, and distribution.
- Collaborate with cross-functional teams to understand data requirements and deliver efficient solutions.
- Ensure data quality, governance, and security standards are met within the on-premise infrastructure.
- Monitor and troubleshoot data pipelines to ensure optimal performance and reliability.
- Automate data workflows and processes to enhance system efficiency.

What do we need from you?
- Bachelor's degree in computer science, software engineering, or a related field
- 6+ years of experience in data engineering or a related field
- Strong programming skills in Python and SQL
- Hands-on experience with the Hadoop ecosystem (HDFS, Hive, etc.)
- Proficiency in Apache NiFi for data ingestion and flow orchestration
- Experience in data modeling, ETL development, and data warehousing concepts
- Strong problem-solving skills and ability to work independently in a fast-paced environment
- Good understanding of data governance, data security, and best practices in on-premise environments

What do you get in return?
- Competitive Salary: your skills and contributions are highly valued here, and we make sure your salary reflects that, rewarding you fairly for the knowledge and experience you bring to the table.
- Dynamic Career Growth: our vibrant environment offers you the opportunity to grow rapidly, providing the right tools, mentorship, and experiences to fast-track your career.
- Idea Tanks: innovation lives here. Our "Idea Tanks" are your playground to pitch, experiment, and collaborate on ideas that can shape the future.
- Growth Chats: dive into our casual "Growth Chats" where you can learn from the best, whether over lunch or during a laid-back session with peers; it's the perfect space to grow your skills.
- Snack Zone: stay fueled and inspired! In our Snack Zone, you'll find a variety of snacks to keep your energy high and ideas flowing.
- Recognition & Rewards: we believe great work deserves to be recognized. Expect regular Hive-Fives, shoutouts, and the chance to see your ideas come to life as part of our reward program.
- Fuel Your Growth Journey with Certifications: we're all about your growth groove! Level up your skills with our support as we cover the cost of your certifications.
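As an illustration of the Python/Spark/HDFS combination this posting centres on, here is a minimal batch-pipeline sketch: read raw JSON events from HDFS, clean and partition them, and write curated Parquet. All paths and column names are hypothetical.

```python
# Minimal sketch of a batch ETL pass over HDFS with PySpark.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-etl").getOrCreate()

raw = spark.read.json("hdfs:///data/raw/events/")           # assumed landing zone
clean = (raw
         .filter(F.col("event_ts").isNotNull())             # drop malformed rows
         .withColumn("event_date", F.to_date("event_ts")))  # derive partition key

(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("hdfs:///data/curated/events/"))             # curated zone
```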

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Experience: 10+ years
Location: BKC, Mumbai

Responsibilities:
- Evangelize, motivate, and enable our customers on their Enterprise Data Cloud journey
- Participate in the pre- and post-sales process, helping the sales, professional services management, and product teams interpret customer use cases
- Use your deep domain, wide technical, and business knowledge to assist customers in defining their data strategy, use-case success criteria, and frameworks to deliver successful implementations
- Design and implement Enterprise Data Cloud architectures and configurations for customers
- Identify and grow professional services engagements and support subscriptions through the clear demonstration of the value we bring to our customers
- Design, create, and recommend standard best-practice design patterns for distributed data pipelines and analytical computing architectures
- Plan and deliver presentations and workshops to customer/internal stakeholders
- Write and produce technical documentation, blogs, and knowledge base articles

Skills and Experience:
- 10+ years of experience
- Extensive customer-facing/consulting experience interacting with large-scale distributed data/computing solutions
- A strong business understanding of how Cloudera technologies solve real-world business problems
- Appreciation of the commercial business cases that drive customer data platform initiatives
- Experience managing project delivery and leading a technical team
- Strong experience designing, architecting, and implementing software solutions in an enterprise Linux environment, including a solid foundation in OS/networking fundamentals
- Strong experience with Hadoop or related technologies, including deployment and administration
- Excellent communication skills, experience with public speaking, and the ability to present to a wide range of audiences
- Proven knowledge of big data/analytical use cases and best-practice approaches to implementing solutions for them
- Strong experience with cloud platforms (e.g. AWS, Azure, Google Cloud)
- Experience with open-source ecosystem programming languages (e.g. Python, Java, Scala, Spark)
- Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.
- Experience designing frameworks for implementing data transformation and processing solutions on Hadoop or related technologies (e.g. HDFS, Hive, Impala, HBase, NiFi)
- Strong understanding of authentication (e.g. LDAP, Active Directory, SAML, Kerberos) and authorization configuration for Hadoop-based distributed systems (see the sketch after this list)
- Deep knowledge of the Data/Big Data business domain
- Familiarity with BI tools and data science notebooks such as Cloudera Data Science Workbench, Apache Zeppelin, Jupyter, and IBM Watson Studio
- Knowledge of scripting tools such as bash shell scripts, Python, or Perl
- Familiarity with DevOps methodology and toolsets, and automation experience with Chef, Puppet, Ansible, or Jenkins
- Ability to travel ~70%
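One concrete instance of the authentication requirement above is connecting to a Kerberized HiveServer2 from Python with PyHive. This is a minimal sketch: the hostname and table are hypothetical, and a valid Kerberos ticket (obtained beforehand via kinit) is assumed.

```python
# Minimal sketch: query a Kerberos-secured Hive service with PyHive.
# Assumes `kinit` has already been run; host and table are placeholders.
from pyhive import hive

conn = hive.Connection(
    host="hive.example.internal",   # assumed HiveServer2 endpoint
    port=10000,
    auth="KERBEROS",
    kerberos_service_name="hive",   # matches the hive/_HOST@REALM principal
)
cur = conn.cursor()
cur.execute("SELECT COUNT(*) FROM default.events")  # hypothetical table
print(cur.fetchone())
cur.close()
conn.close()
```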

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Principal Lead, Backend (Java)

Company Overview:
Birdeye Inc. is a global leader in customer experience management, empowering businesses with innovative solutions to enhance customer interactions and drive growth.

Job Description:
We are seeking a Principal Lead with strong technical expertise in Java, the Spring framework, Kafka, MySQL, and NoSQL databases. The ideal candidate will be responsible for defining system architecture, ensuring scalability, and driving technical innovation. This role requires hands-on experience in designing large-scale backend systems, optimizing performance, and mentoring engineering teams.

Responsibilities:
- Define system architecture and ensure scalability, security, and performance of backend services.
- Lead end-to-end execution of complex projects, from design to deployment.
- Drive PoCs and evaluate new technologies for improving backend efficiency and innovation.
- Optimize system efficiency, troubleshoot critical production issues, and improve system reliability.
- Architect and implement scalable solutions for real-time data processing using Kafka and NiFi (see the consumer sketch below).
- Design, optimize, and manage large-scale data storage with MySQL and NoSQL databases.
- Collaborate with product managers, architects, and other teams to align technical solutions with business goals.
- Provide technical leadership, mentorship, and coaching to engineering teams.
- Enforce best practices in coding, performance optimization, and security.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
- 7+ years of experience in software engineering, with a strong focus on backend development.
- Expertise in Java, the Spring framework, and designing high-performance, scalable systems.
- Hands-on experience with Kafka and NiFi for event-driven architectures and large-scale data processing.
- Deep understanding of MySQL and NoSQL databases, including data optimization and management.
- Strong leadership and technical decision-making skills, with experience mentoring engineers.

This role is ideal for someone who thrives on solving complex technical challenges, driving innovation, and leading high-performing engineering teams. Interested candidates, please send your resume to iqbal.kaur@birdeye.com

Regards,
Iqbal Kaur
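The posting's real-time duties revolve around Kafka. Here is a minimal consumer sketch, written in Python (the kafka-python package) for brevity even though the role itself is Java-centric; the topic, broker address, and event shape are assumptions.

```python
# Minimal sketch: consume JSON events from a Kafka topic.
# Topic name, broker address, and payload fields are assumptions.
import json
from kafka import KafkaConsumer  # kafka-python package

consumer = KafkaConsumer(
    "review-events",                         # hypothetical topic
    bootstrap_servers="localhost:9092",
    group_id="review-indexer",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for msg in consumer:
    event = msg.value
    # A real service would upsert into MySQL/NoSQL storage here.
    print(f"partition={msg.partition} offset={msg.offset} id={event.get('id')}")
```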

Posted 3 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


About _VOIS:
_VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain, and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, _VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

About _VOIS India:
In 2009, _VOIS started operating in India and now has established global delivery centres in Pune, Bangalore, and Ahmedabad. With more than 14,500 employees, _VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations, and more.

Role-Specific Requirements
- Experience managing the development lifecycle for agile software development projects
- Expert-level experience in designing, building, and managing data pipelines for batch and streaming applications
- Experience with performance tuning for batch-based applications like Hadoop, including working knowledge of NiFi, YARN, Hive, Airflow, and Spark
- Experience with performance tuning streaming-based applications for real-time data processing using Kafka, Confluent Kafka, AWS Kinesis, GCP Pub/Sub, or similar
- Experience working with serverless services such as OpenShift, GCP, or AWS
- Working experience with AWS is a prerequisite
- Working experience with other distributed technologies such as Cassandra, DynamoDB, MongoDB, Elasticsearch, and Flink is desirable

_VOIS Equal Opportunity Employer Commitment, India:
_VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment.

We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we'll be in touch!

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description
Senior Associate / Manager, NiFi Developer
Job Location: Pan India
Candidates should possess 8 to 12 years of experience, of which 3+ years should be relevant.

Roles & Responsibilities:
- Design, develop, and manage data pipelines using Apache NiFi
- Integrate with systems like Kafka, HDFS, Hive, Spark, and RDBMS
- Monitor, troubleshoot, and optimize data flows
- Ensure data quality, reliability, and security
- Work with cross-functional teams to gather requirements and deliver data solutions

Skills Required:
- Strong hands-on experience with Apache NiFi (see the ExecuteScript sketch below)
- Knowledge of data ingestion, streaming, and batch processing
- Experience with Linux, shell scripting, and cloud environments (AWS/GCP is a plus)
- Familiarity with REST APIs, JSON/XML, and data transformation

Other Information
Role: NiFi Developer - SA/M
Industry Type: IT/Computers - Software
Functional Area: IT-Software
Required Education: Any Graduate
Employment Type: Full Time, Permanent
Key Skills: NiFi Developer, Design, Development
Job Code: GO/JC/21435/2025
Recruiter Name: SPriya
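For a sense of the hands-on NiFi work this role involves, here is a minimal ExecuteScript processor body (Jython engine). The attribute names are hypothetical; `session`, `log`, and the `REL_*` relationships are bindings NiFi injects into the script.

```python
# Minimal sketch: tag each FlowFile with a routing attribute inside an
# ExecuteScript processor (Jython). Attribute names are hypothetical.
flowFile = session.get()
if flowFile is not None:
    source = flowFile.getAttribute("source.system") or "unknown"  # assumed attribute
    flowFile = session.putAttribute(flowFile, "route.key", source.lower())
    log.info("tagged FlowFile {} with route.key={}".format(
        flowFile.getAttribute("uuid"), source.lower()))
    session.transfer(flowFile, REL_SUCCESS)
```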

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Description and Requirements

Join our team and what we'll accomplish together
The Wireless Core Network Development team is responsible for end-to-end network architecture, development, and operations, including service orchestration and automation. The team designs, develops, maintains, and supports our core wireless network and all its services specific to our customer data. We work as a team to introduce the latest technology and software to enable network orchestration and automation in the fast-evolving 5G ecosystem, and propel TELUS' digital transformation. Our team creates previously impossible solutions by leveraging a new approach to enable unique and rich wireless experiences for our customers. These innovative solutions will improve the life quality of thousands while revolutionizing how everything and everyone connects. You will own the customer experience by providing strategy, managing change, and leveraging best-in-class security and AI to deliver reliable products to our customers. This represents a fundamental change in how the telecom industry works, opening the possibility of making private cellular networks globally available, sparking innovation, and enabling access to the digital world for more people by providing never-before-seen reliability at reduced costs.

What you'll do
- Take overall responsibility for the architecture, design, and operational support of TELUS subscriber database solutions (HLR, HSS, EIR, IMEIDB, UDM, UDR). This includes, but is not limited to, fully understanding how the current network is architected and identifying the areas of improvement/modernization we need to undertake, driving reliability and efficiency in the support of the solution
- Help us design, develop, and implement software solutions supporting the subscriber data platforms within the 5G core architecture. This will include management, assurance, and closed-loop operation of the UDM, AUSF, and SDL, which will reside on cloud-native services
- Bring your ideas, bring your coding skills, and bring your passion to learn
- Identify end-to-end network control signaling and roaming gaps across available and ongoing designs, while architecting future-friendly solutions as technology evolves
- Collaborate with cross-functional teams from the Radio, Core, Transport, Infrastructure, Business, and Assurance domains, and define migration strategies for moving services to the cloud
- Bring your experience in Open API, security, configuration, and data model management and processing with Node.js, and learn or bring your experience in other technologies such as RESTful APIs, JSON, NETCONF, Apache NiFi, Kafka, SNMP, Java, Bash, Python, HTTPS, SSH, and TypeScript
- Maintain and develop network architecture/design documents

What you bring:
- 5+ years of telecommunication experience
- Experience in adapter API design using RESTful and NETCONF interfaces, and an interest in developing back-end software
- Proven knowledge of technologies such as Service Based Architecture (SBA), Subscriber Data Management functions, HTTP/2, Diameter, SIGTRAN, SS7, and the 5G protocol stack
- General understanding of TCP/IP networking and familiarity with the TCP, UDP, SS7, RADIUS, and Diameter protocols, along with SOAP/REST API working principles
- Proven understanding of IPsec and TLS 1.2/1.3, and understanding of the OAuth 2.0 framework
- 2+ years' experience as a software developer, advanced technical and analytical skills, and the ability to take responsibility for the overall technical direction of the project
- Experience with public cloud-native services like OpenShift, AWS, GCP, or Azure
- Expert knowledge of database redundancy, replication, and synchronization
- Knowledge of different database concepts (relational vs. non-relational)
- Subject matter expertise in implementing, integrating, and deploying solutions related to subscriber data management (HLR, HSS, EIR, IMEIDB, UDM, UDR, F5, Provisioning GW, AAA) on either private cloud or public cloud such as AWS, OCP, or GCP
- Expert knowledge of the software project lifecycle and CI/CD pipelines
- A Bachelor's degree in Computer Science, Computer Engineering, Electrical Engineering, or a STEM-related field, or relevant experience

Great-to-haves:
- Understanding of 3GPP architectures and reference points for 4G and 5G wireless networks
- Knowledge of the 3GPP, TMF, GSMA, and IETF standards bodies
- Experience with Radio, Core, Transport, and Infrastructure product design, development, integration, test, and operations, including low-level protocol implementation on top of UDP, SCTP, GTPv1, and GTPv2
- Experience with MariaDB, Cassandra, MongoDB, and data model management
- AWS Fargate, Lambda, DynamoDB, SQS, Step Functions, CloudWatch, CloudFormation, and/or the AWS Cloud Development Kit
- Knowledge of Python and API development in production environments
- Experience with containerization tools such as Docker, Kubernetes, and/or OpenStack technology

Soft Skills:
- Strong analytical and problem-solving abilities
- Excellent communication skills, both written and verbal
- Ability to work effectively in a team environment
- Self-motivated with a proactive approach to learning new technologies
- Capable of working under pressure and managing multiple priorities

EEO Statement
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications, and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources, and exceptional customer service, all backed by TELUS, our multi-billion dollar telecommunications parent.

Equal Opportunity Employer
At TELUS Digital, we are proud to be an equal opportunity employer and are committed to creating a diverse and inclusive workplace. All aspects of employment, including the decision to hire and promote, are based on applicants' qualifications, merits, competence, and performance, without regard to any characteristic related to diversity.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Role: .NET Solution Architect
Domain: Insurance
Stack: Microsoft technologies, Azure, full stack, etc. (read below)

What do we look for? Strong architecture skills, conceptual clarity, problem-solving ability, leadership qualities, client-facing experience, and depth in design and integration architecture.

Responsibilities: For a change, we are spelling out what qualifies a profile and the depth of architecture required. Candidates should read through the requirements and share their resume with a separate page containing details on the questions below.

1. Scaled Architecture Design
Evaluation will be on the candidate's ability to design systems at scale (e.g., NPCI UPI, Uber, ONDC, RedBus):
- How well do you decompose monolithic systems into scalable components (e.g., ONDC architecture, NPCI as a central switch, geospatial for Uber)?
- Do you address critical non-functional requirements (NFRs) like latency (<1s for UPI), throughput (10k+ TPS), and fault tolerance?
- Patterns and frameworks: usage of event-driven architecture (Kafka), CQRS, the Saga pattern (for distributed transactions), and Circuit Breaker (resilience).
- Challenges: your approach to interoperability (UPI's PSP integration), real-time data sync (Uber's driver tracking), and overbooking prevention (RedBus).

2. Architecture Patterns vs. Styles
- How do you justify pattern choices (e.g., Bulkhead for RedBus vs. Service Mesh for Uber)?

3. Standards & Governance
- Compliance: familiarity with TOGAF, ISO/IEC 42010, or the Zachman framework.
- API governance: experience with tools like Apigee for centralized API management.

4. Self-Healing Systems
- Use of Kubernetes (auto-scaling, pod recovery), chaos engineering (Chaos Monkey), and observability stacks (Prometheus/Grafana).
- How do you implement retry mechanisms, circuit breakers, and automated rollbacks? (See the sketch after this list.)

5. D2C Platform Expertise
- Proficiency in headless commerce (e.g., Contentful), CDPs (customer data platforms like Segment), and API-first design.

6. Integration Architecture
- Experience with protocol transformation (REST to SOAP), message queuing (RabbitMQ/Kafka), and ETL pipelines (Apache NiFi).

7. Security Architecture
- Beyond SSO/auth: how do you implement zero-trust models, WAFs (Cloudflare), and SIEM tools (Splunk)?
- Do you address OWASP Top 10 risks (e.g., SQLi, XSS) and data encryption (AES-256, TLS 1.3)?

8. Platform KPIs & Metrics
- Focus on uptime (99.99% SLA), MTTR (<1 hr), TPS, and cost optimization.
- How do you align KPIs with business goals (e.g., user growth vs. infrastructure scaling)?

9. Edge Computing and Edge-Based Processing
- Use of AWS Greengrass or Kubernetes at the edge for local data processing.
- How do you address latency reduction, offline capabilities, and edge-node security?
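On point 4 above, a minimal illustrative sketch of retries plus a circuit breaker, in Python for brevity (on the .NET stack this is usually handled by a resilience library such as Polly). The thresholds are arbitrary assumptions.

```python
# Minimal sketch: exponential-backoff retries behind a crude
# failure-count circuit breaker. Thresholds are arbitrary.
import time

class CircuitOpenError(Exception):
    """Raised when the breaker refuses calls after repeated failures."""

class ResilientCaller:
    def __init__(self, retries=3, base_delay=0.5, failure_threshold=5):
        self.retries = retries
        self.base_delay = base_delay
        self.failure_threshold = failure_threshold
        self.failures = 0

    def call(self, fn):
        if self.failures >= self.failure_threshold:
            raise CircuitOpenError("breaker open: too many recent failures")
        for attempt in range(self.retries):
            try:
                result = fn()
                self.failures = 0  # success closes the breaker again
                return result
            except Exception:
                self.failures += 1
                if attempt == self.retries - 1:
                    raise  # retries exhausted; surface the error
                time.sleep(self.base_delay * 2 ** attempt)  # backoff
```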

Posted 3 weeks ago

Apply

0 years

0 Lacs

Agra, Uttar Pradesh, India

On-site


Major Accountabilities
- Collaborate with the CIO on application architecture and design of our ETL (Extract, Transform, Load) and other aspects of data pipelines. Our stack is built on top of the well-known Spark ecosystem (e.g., Scala, Python)
- Periodically evaluate the architectural landscape for efficiencies in our data pipelines, and define current-state and target-state architecture along with transition plans and road maps to achieve the desired architectural state
- Conduct, lead, and implement proofs of concept to prove out new technologies in support of the architecture vision and guiding principles (e.g., Flink)
- Assist in the ideation and execution of architectural principles, guidelines, and technology standards that can be leveraged across the team and organization, especially around ETL and data pipelines
- Promote consistency between all applications leveraging enterprise automation capabilities
- Provide architectural consultation, support, mentoring, and guidance to project teams (architects, data scientists, developers, etc.)
- Collaborate with the DevOps Lead on technical features
- Define and manage work items using Agile methodologies (Kanban, Azure Boards, etc.)
- Lead data engineering efforts (e.g., Scala Spark, PySpark)

Knowledge & Experience
- Experienced with Spark, Delta Lake, and Scala, working with petabytes of data in both batch and streaming flows
- Knowledge of a wide variety of open-source technologies including but not limited to NiFi, Kubernetes, Docker, Hive, Oozie, YARN, ZooKeeper, PostgreSQL, RabbitMQ, and Elasticsearch
- A strong understanding of AWS/Azure and/or technology as a service (IaaS, SaaS, PaaS)
- Strong verbal and written communication skills, as well as the ability to work effectively across internal and external organizations and virtual teams
- Appreciation of building high-volume, low-latency systems for the API flow
- Core dev skills (SOLID principles, IoC, 12-factor apps, CI/CD, Git)
- Messaging, microservice architecture, caching (Redis), containerization, performance and load testing, REST APIs
- Knowledge of HTML, JavaScript frameworks (preferably Angular 2+), and TypeScript
- Appreciation of Python and C# .NET Core or Java
- Appreciation of global data privacy requirements and cryptography
- Experience in system testing and automated testing, e.g., unit tests, integration tests, mocking/stubbing

Relevant Industry and Other Professional Qualifications
- Tertiary qualifications (degree level)

We are an inclusive employer and welcome applicants from all backgrounds. We pride ourselves on our commitment to equality and diversity and are committed to removing barriers throughout our hiring process.

Key Requirements
- Extensive data engineering development experience (e.g., ETL) using well-known stacks such as Scala Spark (see the Delta Lake sketch below)
- Experience in technical leadership positions (or looking to gain such experience)
- Background in software engineering
- The ability to write technical documentation
- Solid understanding of virtualization and/or cloud computing technologies (e.g., Docker, Kubernetes)
- Experience designing software solutions; enjoys UML and the odd sequence diagram
- Experience operating within an Agile environment
- Ability to work independently and with minimum supervision
- Strong project development management skills, with the ability to successfully manage and prioritize numerous time-pressured analytical projects/work tasks simultaneously
- Able to pivot quickly and make rapid decisions based on changing needs in a fast-paced environment
- Works constructively with teams and acts with high integrity
- Passionate team player with an inquisitive, creative mindset and the ability to think outside the box

Skills: Java, Scala, Apache Spark, Spark, Hadoop, and ETL
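Given the Spark and Delta Lake emphasis above, here is a minimal sketch of a Delta Lake upsert (merge) in PySpark. The paths and join key are hypothetical, and the delta-spark package is assumed to be on the classpath.

```python
# Minimal sketch: upsert a staging batch into a Delta Lake table.
# Paths, the join key, and delta-spark availability are assumptions.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("delta-upsert").getOrCreate()

updates = spark.read.parquet("hdfs:///data/staging/customers/")   # assumed staging
target = DeltaTable.forPath(spark, "hdfs:///data/delta/customers/")

(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()      # refresh changed rows
       .whenNotMatchedInsertAll()   # add brand-new rows
       .execute())
```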

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Persistent
We are an AI-led, platform-driven Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative global companies, 60% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem.

Our disruptor's mindset, commitment to client success, and agility to thrive in a dynamic environment have enabled us to sustain our growth momentum by reporting $1,409.1M revenue in FY25, delivering 18.8% Y-o-Y growth. Our 23,900+ global team members, located in 19 countries, have been instrumental in helping market leaders transform their industries. We are also pleased to share that Persistent won in four categories at the prestigious 2024 ISG Star of Excellence™ Awards, including the Overall Award based on the voice of the customer. We were included in the Dow Jones Sustainability World Index, setting high standards in sustainability and corporate responsibility. We were awarded for our state-of-the-art learning and development initiatives at the 16th TISS LeapVault CLO Awards. In addition, we were cited as the fastest-growing IT services brand in the 2024 Brand Finance India 100 Report. Throughout our market-leading growth, we've maintained a strong employee satisfaction score of 8.2/10.

At Persistent, we embrace diversity to unlock everyone's potential. Our programs empower our workforce by harnessing varied backgrounds for creative, innovative problem-solving. Our inclusive environment fosters belonging, encouraging employees to unleash their full potential. For more details, please visit www.persistent.com

About The Position
We are looking for a Data Architect with creativity and results-oriented critical thinking to meet complex challenges and develop new strategies for acquiring, analyzing, modeling, and storing data. In this role you will guide the company into the future and utilize the latest technology and information management methodologies to meet our requirements for effective logical data modeling, metadata management, and database warehouse domains. You will be working with experts in a variety of industries, including computer science and software development, as well as department heads and senior executives, to integrate new technologies and refine system performance. We reward dedicated performance with exceptional pay and benefits, as well as tuition reimbursement and career growth opportunities.
What You'll Do
- Define data retention policies
- Monitor performance and advise on any necessary infrastructure changes
- Mentor junior engineers and work with other architects to deliver best-in-class solutions
- Implement ETL/ELT processes and orchestration of data flows
- Recommend and drive adoption of newer tools and techniques from the big data ecosystem

Expertise You'll Bring
- 10+ years in industry, building and managing big data systems
- Building, monitoring, and optimizing reliable and cost-efficient pipelines for SaaS is a must
- Building stream-processing systems using solutions such as Storm or Spark Streaming (see the streaming sketch after this posting)
- Dealing and integrating with data storage systems like SQL and NoSQL databases, file systems, and object storage like S3
- Reporting solutions like Pentaho, Power BI, and Looker, including customizations
- Developing high-concurrency, high-performance applications that are database-intensive and have interactive, browser-based clients
- Working with SaaS-based data management products will be an added advantage
- Proficiency and expertise in:
  - Cloudera/Hortonworks, Spark, HDF, and NiFi
  - RDBMS and NoSQL stores like Vertica and Redshift; data modelling with physical design and SQL performance optimization
  - Messaging systems: JMS, ActiveMQ, RabbitMQ, Kafka
  - Big data technology like Hadoop, Spark, and NoSQL-based data-warehousing solutions
  - Data warehousing and reporting, including customization; Hadoop, Spark, Kafka, core Java, Spring/IoC, design patterns
  - Big data querying tools such as Pig, Hive, and Impala
  - Open-source technologies and databases (SQL and NoSQL)
- Proficient understanding of distributed computing principles
- Ability to solve any ongoing issues with operating the cluster
- Scaling data pipelines using open-source components and AWS services
- Cloud (AWS) provisioning, capacity planning, and performance analysis at various levels
- Web-based SOA architecture implementation with design pattern experience will be an added advantage

Benefits
- Competitive salary and benefits package
- Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds.

Inclusive Environment
We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.
Our company fosters a values-driven and people-centric work environment that enables our employees to:
- Accelerate growth, both professionally and personally
- Impact the world in powerful, positive ways, using the latest technologies
- Enjoy collaborative innovation, with diversity and work-life wellbeing at the core
- Unlock global opportunities to work and learn with the industry's best

Let's unleash your full potential at Persistent: persistent.com/careers
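On the stream-processing bullet above (Storm or Spark Streaming), here is a minimal Spark Structured Streaming sketch that reads from Kafka and lands Parquet on S3. The broker, topic, and paths are assumptions, and the Kafka connector package must be on the Spark classpath.

```python
# Minimal sketch: Kafka -> Parquet with Spark Structured Streaming.
# Broker, topic, and S3 paths are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stream-ingest").getOrCreate()

events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "events")                    # hypothetical topic
          .load()
          .select(F.col("value").cast("string").alias("payload")))

query = (events.writeStream
         .format("parquet")
         .option("path", "s3a://bucket/curated/events/")          # assumed sink
         .option("checkpointLocation", "s3a://bucket/chk/events/")
         .start())
query.awaitTermination()
```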

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Delhi, Delhi

On-site


Job Description: Hadoop & ETL Developer

Job Summary
We are looking for a Hadoop & ETL Developer with strong expertise in big data processing, ETL pipelines, and workflow automation. The ideal candidate will have hands-on experience in the Hadoop ecosystem, including HDFS, MapReduce, Hive, Spark, HBase, and PySpark, as well as expertise in real-time data streaming and workflow orchestration. This role requires proficiency in designing and optimizing large-scale data pipelines to support enterprise data processing needs.

Key Responsibilities
- Design, develop, and optimize ETL pipelines leveraging Hadoop ecosystem technologies.
- Work extensively with HDFS, MapReduce, Hive, Sqoop, Spark, HBase, and PySpark for data processing and transformation.
- Implement real-time and batch data ingestion using Apache NiFi, Kafka, and Airbyte.
- Develop and manage workflow orchestration using Apache Airflow (see the DAG sketch below).
- Perform data integration across structured and unstructured data sources, including MongoDB and Hadoop-based storage.
- Optimize MapReduce and Spark jobs for performance, scalability, and efficiency.
- Ensure data quality, governance, and consistency across the pipeline.
- Collaborate with data engineering teams to build scalable and high-performance data solutions.
- Monitor, debug, and enhance big data workflows to improve reliability and efficiency.

Required Skills & Experience
- 3+ years of experience in the Hadoop ecosystem (HDFS, MapReduce, Hive, Sqoop, Spark, HBase, PySpark).
- Strong expertise in ETL processes, data transformation, and data warehousing.
- Hands-on experience with Apache NiFi, Kafka, Airflow, and Airbyte.
- Proficiency in SQL and handling structured and unstructured data.
- Experience with NoSQL databases like MongoDB.
- Strong programming skills in Python or Scala for scripting and automation.
- Experience in optimizing Spark and MapReduce jobs for high-performance computing.
- Good understanding of data lake architectures and big data best practices.

Preferred Qualifications
- Experience in real-time data streaming and processing.
- Familiarity with Docker/Kubernetes for deployment and orchestration.
- Strong analytical and problem-solving skills with the ability to debug and optimize data workflows.

If you have a passion for big data, ETL, and large-scale data processing, we'd love to hear from you!

Job Types: Full-time, Contractual/Temporary
Pay: ₹400,000.00 - ₹1,100,000.00 per year
Schedule: Day shift, Monday to Friday, morning shift

Application Questions:
- How many years of experience do you have in big data ETL?
- How many years of experience do you have in Hadoop?
- Are you willing to work on a contractual basis?
- Are you comfortable being on a third-party payroll?
- Are you from Delhi?
- What is the notice period at your current company?

Work Location: In person
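To make the orchestration responsibility concrete, here is a minimal Airflow DAG sketch of the shape described: a daily Spark transform followed by a quality check. The DAG id, schedule, and task commands are hypothetical.

```python
# Minimal sketch: a daily two-step ETL DAG in Apache Airflow.
# DAG id, schedule, and commands are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_events_etl",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    transform = BashOperator(
        task_id="spark_transform",
        bash_command="spark-submit /jobs/transform_events.py",  # assumed job
    )
    quality_check = BashOperator(
        task_id="quality_check",
        bash_command="python /jobs/check_event_counts.py",      # assumed check
    )
    transform >> quality_check  # run the check after the transform
```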

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


About Persistent
We are an AI-led, platform-driven Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative global companies, 60% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem.

Our disruptor's mindset, commitment to client success, and agility to thrive in a dynamic environment have enabled us to sustain our growth momentum by reporting $1,409.1M revenue in FY25, delivering 18.8% Y-o-Y growth. Our 23,900+ global team members, located in 19 countries, have been instrumental in helping market leaders transform their industries. We are also pleased to share that Persistent won in four categories at the prestigious 2024 ISG Star of Excellence™ Awards, including the Overall Award based on the voice of the customer. We were included in the Dow Jones Sustainability World Index, setting high standards in sustainability and corporate responsibility. We were awarded for our state-of-the-art learning and development initiatives at the 16th TISS LeapVault CLO Awards. In addition, we were cited as the fastest-growing IT services brand in the 2024 Brand Finance India 100 Report. Throughout our market-leading growth, we've maintained a strong employee satisfaction score of 8.2/10.

At Persistent, we embrace diversity to unlock everyone's potential. Our programs empower our workforce by harnessing varied backgrounds for creative, innovative problem-solving. Our inclusive environment fosters belonging, encouraging employees to unleash their full potential. For more details, please visit www.persistent.com

About The Position
We are looking for a Data Architect with creativity and results-oriented critical thinking to meet complex challenges and develop new strategies for acquiring, analyzing, modeling, and storing data. In this role you will guide the company into the future and utilize the latest technology and information management methodologies to meet our requirements for effective logical data modeling, metadata management, and database warehouse domains. You will be working with experts in a variety of industries, including computer science and software development, as well as department heads and senior executives, to integrate new technologies and refine system performance. We reward dedicated performance with exceptional pay and benefits, as well as tuition reimbursement and career growth opportunities.
What You'll Do
Define data retention policies
Monitor performance and advise on any necessary infrastructure changes
Mentor junior engineers and work with other architects to deliver best-in-class solutions
Implement ETL/ELT processes and orchestration of data flows
Recommend and drive adoption of newer tools and techniques from the big data ecosystem

Expertise You'll Bring
10+ years in industry, building and managing big data systems
Building, monitoring, and optimizing reliable and cost-efficient pipelines for SaaS is a must
Building stream-processing systems using solutions such as Storm or Spark Streaming (illustrated in the sketch at the end of this posting)
Integrating with data storage systems such as SQL and NoSQL databases, file systems, and object storage like S3
Reporting solutions such as Pentaho, Power BI, and Looker, including customizations
Developing high-concurrency, high-performance applications that are database-intensive and have interactive, browser-based clients
Working with SaaS-based data management products will be an added advantage
Proficiency and expertise in Cloudera/Hortonworks, Spark, HDF, and NiFi
RDBMS and analytical stores such as Vertica and Redshift; data modelling with physical design and SQL performance optimization
Messaging systems: JMS, ActiveMQ, RabbitMQ, Kafka
Big data technology such as Hadoop, Spark, and NoSQL-based data-warehousing solutions
Data warehousing and reporting, including customization; Hadoop, Spark, Kafka, core Java, Spring/IoC, design patterns
Big data querying tools such as Pig, Hive, and Impala
Open-source technologies and databases (SQL & NoSQL)
Proficient understanding of distributed computing principles
Ability to solve any ongoing issues with operating the cluster
Scale data pipelines using open-source components and AWS services
Cloud (AWS) provisioning, capacity planning, and performance analysis at various levels
Web-based SOA architecture implementation with design-pattern experience will be an added advantage

Benefits
Competitive salary and benefits package
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds.

Inclusive Environment
We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.
Our company fosters a values-driven and people-centric work environment that enables our employees to:
Accelerate growth, both professionally and personally
Impact the world in powerful, positive ways, using the latest technologies
Enjoy collaborative innovation, with diversity and work-life wellbeing at the core
Unlock global opportunities to work and learn with the industry's best

Let's unleash your full potential at Persistent - persistent.com/careers
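The expertise list above calls for stream-processing systems such as Storm or Spark Streaming. Below is a minimal Spark Structured Streaming sketch, assuming a local Kafka broker and a hypothetical clickstream topic; it illustrates the pattern rather than any Persistent project.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Assumes the Kafka connector is on the classpath, e.g. launched with:
# spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.1 app.py
spark = SparkSession.builder.appName("clickstream-counts").getOrCreate()

# Source: a hypothetical 'clickstream' topic on a local broker.
events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "localhost:9092")
         .option("subscribe", "clickstream")
         .load()
)

# Minute-level event counts, with a watermark to bound late data.
counts = (
    events.select("timestamp")
          .withWatermark("timestamp", "10 minutes")
          .groupBy(F.window("timestamp", "1 minute"))
          .count()
)

# Sink: console for the sketch; a real pipeline targets S3, a warehouse, etc.
query = (
    counts.writeStream
          .outputMode("update")
          .format("console")
          .start()
)
query.awaitTermination()
```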

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Job Description

How you will contribute

Integration Architecture Ownership:
Take ownership of the end-to-end integration architecture across all planning tracks (Demand, Supply, etc.).
Design and maintain the overall integration strategy, ensuring scalability, reliability, and security.
Oversee inbound and outbound data transformations and orchestration processes.

Decision Support & Guidance:
Support decision-making related to integration disposition, data transformations, and performance assessments.
Provide guidance and recommendations on integration approaches, technologies, and best practices.
Collaborate with stakeholders to understand business requirements and translate them into technical solutions.

Inter-Tenant Data Transfer Design:
Design and implement secure and efficient inter-tenant data transfer mechanisms.
Ensure data integrity and consistency across different o9 environments.

Team Guidance & Mentoring:
Provide technical guidance and mentorship to juniors in the team on building and maintaining interfaces.
Share best practices for integration development, testing, and deployment.
Conduct code reviews and ensure adherence to coding standards.

CI/CD Implementation:
Design and implement a robust CI/CD pipeline for integration deployments.
Automate integration testing and deployment processes to ensure rapid and reliable releases.

Batch Orchestration Design:
Design and implement batch orchestration processes for all planning tracks (see the orchestration sketch below).
Optimize batch processing schedules to minimize processing time and resource utilization.

Technical Leadership & Implementation:
Serve as a technical leader and subject matter expert on o9 integration.
Lead and participate in the implementation of end-to-end SCM solutions.
Provide hands-on support for troubleshooting and resolving integration issues.

Qualifications

Experience:
Delivered a minimum of 2-3 comprehensive end-to-end SCM product implementations as a Technical Architect.
At least 8 years of experience in the SDLC, with a key emphasis on architecting, designing, and developing solutions using big data technologies.

Technical Skills:
Proficiency in SSIS packages and the Python, PySpark, and SQL programming languages.
Experience with workflow management tools like Airflow and SSIS.
Experience with Amazon Web Services (AWS), Azure, or Google Cloud infrastructure preferred.
Experience working with Parquet, JSON, RESTful APIs, HDFS, Delta Lake, and query frameworks like Hive and Presto.
Deep understanding and hands-on experience with writing orchestration workflows and/or API coding (knowledge of Apache NiFi is a plus).
Good hands-on technical expertise in building scalable interfaces, performance tuning, data cleansing, and validation strategies.
Experience working with version control platforms (e.g., GitHub, Azure DevOps).
Experience with Delta Lake and PySpark is a must.

Other Skills:
Good to have experience in cloud data quality, source systems analysis, business rules validation, source-target mapping design, performance tuning, and high-volume data loads.
Familiarity with Agile methodology.
Proficient in the use of Microsoft Excel/PowerPoint/Visio for analysis and presentation.
Excellent communication and interpersonal skills.
Strong problem-solving and analytical abilities.
Ability to work independently and as part of a team.
Proactive and results-oriented.
Ability to thrive in a fast-paced environment.
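For the batch-orchestration responsibilities above, here is a minimal Apache Airflow DAG sketch (Airflow 2.4+ assumed). The DAG id, schedule, and the three scripts it shells out to are hypothetical placeholders, not part of any actual o9 implementation.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Nightly extract -> transform -> load, with retries; all names hypothetical.
default_args = {"retries": 2, "retry_delay": timedelta(minutes=5)}

with DAG(
    dag_id="demand_planning_batch",
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",  # 02:00 daily (Airflow 2.4+ 'schedule' argument)
    catchup=False,
    default_args=default_args,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="python extract.py")
    transform = BashOperator(task_id="transform", bash_command="spark-submit transform.py")
    load = BashOperator(task_id="load", bash_command="python load.py")

    # Linear dependency chain; fan-out/fan-in works the same way.
    extract >> transform >> load
```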
Within-country relocation support is available, and for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary
Headquartered in Singapore, Mondelēz International's Asia, Middle East and Africa (AMEA) region is comprised of six business units, has more than 21,000 employees, and operates in more than 27 countries, including Australia, China, Indonesia, Ghana, India, Japan, Malaysia, New Zealand, Nigeria, the Philippines, Saudi Arabia, South Africa, Thailand, the United Arab Emirates, and Vietnam. Seventy-six nationalities work across a network of more than 35 manufacturing plants, three global research and development technical centers, and offices stretching from Auckland, New Zealand to Casablanca, Morocco. Mondelēz International in the AMEA region is the proud maker of global and local iconic brands such as Oreo and belVita biscuits, Kinh Do mooncakes, Cadbury, Cadbury Dairy Milk and Milka chocolate, Halls candy, Stride gum, Tang powdered beverage, and Philadelphia cheese. We are also proud to be named a Top Employer in many of our markets.

Mondelēz International is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular
Software & Applications
Technology & Digital

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Title: Sr. DevOps Engineer
Location: Ahmedabad (Onsite)
Experience: 8+ Years

Job Description:
6+ years of experience in an SRE role, deploying and maintaining applications, performance tuning, conducting application upgrades and patches, and supporting continuous integration and deployment tooling
4+ years of experience deploying and maintaining applications in AWS
Experience with Docker (or similar) and with Kubernetes (or similar); see the sketch after this posting
Experience supporting Hadoop or any other big data platform (Spark, PySpark/Delta Lake, NiFi, Airflow, Hive, HDFS, Kafka, Impala, etc.)

Skills:
Ability to debug issues and solve problems
Working knowledge of Jenkins, Ansible, Terraform, and ArgoCD
Knowledge of at least one scripting language (Bash, shell, PowerShell, or Python)
Administration of databases (MS SQL, Mongo, SSIS)
Working knowledge of the Linux operating system
Strong in operating system concepts, Linux, and troubleshooting
Automation and cloud skills
Passion to learn and adapt to new technology

We really value team spirit: transparency and frequent communication are key. At o9, this is not limited by hierarchy, distance, or function.

Education:
Bachelor's degree in Computer Science, Software Engineering, Information Technology, Industrial Engineering, or Engineering Management
Cloud (at least one) and Kubernetes administration certification
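As an illustration of the scripted operational checks this role mentions, here is a small sketch using the official Kubernetes Python client. It assumes a reachable cluster and a valid kubeconfig, and is not tied to any specific environment.

```python
from kubernetes import client, config  # pip install kubernetes

# Assumes a valid kubeconfig (e.g. ~/.kube/config); inside a cluster,
# use config.load_incluster_config() instead.
config.load_kube_config()
v1 = client.CoreV1Api()

# A simple health sweep: report pods that are not Running or Succeeded.
for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    phase = pod.status.phase
    if phase not in ("Running", "Succeeded"):
        print(f"{pod.metadata.namespace}/{pod.metadata.name}: {phase}")
```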

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Title: Data Architect – Data Integration & Engineering
Location: Hybrid
Experience: 8+ years

Job Summary:
We are seeking an experienced Data Architect specializing in data integration, data engineering, and hands-on coding to design, implement, and manage scalable, high-performance data solutions. The ideal candidate should have expertise in ETL/ELT, cloud data platforms, big data technologies, and enterprise data architecture.

Key Responsibilities:

1. Data Architecture & Design:
Develop enterprise-level data architecture solutions, ensuring scalability, performance, and reliability.
Design data models (conceptual, logical, physical) for structured and unstructured data.
Define and implement data integration frameworks using industry-standard tools.
Ensure compliance with data governance, security, and regulatory policies (GDPR, HIPAA, etc.).

2. Data Integration & Engineering:
Implement ETL/ELT pipelines using Informatica, Talend, Apache NiFi, or dbt.
Work with batch and real-time data processing tools such as Apache Kafka, Kinesis, and Apache Flink (see the Kafka producer sketch below).
Integrate and optimize data lakes, data warehouses, and NoSQL databases.

3. Hands-on Coding & Development:
Write efficient and scalable code in Python, Java, or Scala for data transformation and processing.
Optimize SQL queries, stored procedures, and indexing strategies for performance tuning.
Build and maintain Spark-based data processing solutions in Databricks and Cloudera ecosystems.
Develop workflow automation using Apache Airflow, Prefect, or similar tools.

4. Cloud & Big Data Technologies:
Work with cloud platforms such as AWS (Redshift, Glue), Azure (Data Factory, Synapse), and GCP (BigQuery, Dataflow).
Manage big data processing using Cloudera, Hadoop, HBase, and Apache Spark.
Deploy containerized data services using Kubernetes and Docker.
Automate infrastructure using Terraform and CloudFormation.

5. Governance, Security & Compliance:
Implement data security, masking, and encryption strategies.
Define RBAC (Role-Based Access Control) and IAM policies for data access.
Work on metadata management, data lineage, and cataloging.

Required Skills & Technologies:

Data Engineering & Integration:
ETL/ELT tools: Informatica, Talend, Apache NiFi, dbt
Big data ecosystem: Cloudera, HBase, Apache Hadoop, Spark
Data streaming: Apache Kafka, AWS Kinesis, Apache Flink
Data warehouses: Snowflake, AWS Redshift, Google BigQuery, Azure Synapse
Databases: PostgreSQL, MySQL, MongoDB, Cassandra

Programming & Scripting:
Languages: Python, Java, Scala
Scripting: Shell, PowerShell, Bash
Frameworks: PySpark, Spark SQL

Cloud & DevOps:
Cloud platforms: AWS, Azure, GCP
Containerization & orchestration: Kubernetes, Docker
CI/CD pipelines: Jenkins, GitHub Actions, Terraform, CloudFormation

Security & Governance:
Compliance standards: GDPR, HIPAA, SOC 2
Data cataloging: Collibra, Alation
Access controls: IAM, RBAC, ABAC

Preferred Certifications:
AWS Certified Data Analytics – Specialty
Microsoft Certified: Azure Data Engineer Associate
Google Professional Data Engineer
Databricks Certified Data Engineer Associate/Professional
Cloudera Certified Data Engineer
Informatica Certified Professional

Education & Experience:
Bachelor's/Master's degree in Computer Science, MCA, Data Engineering, or a related field.
8+ years of experience in data architecture, integration, and engineering.
Proven expertise in designing and implementing enterprise-scale data solutions.
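To illustrate the real-time ingestion side mentioned under Data Integration & Engineering, here is a minimal Kafka producer sketch using the kafka-python library. The broker address, topic name, and event shape are hypothetical.

```python
import json

from kafka import KafkaProducer  # pip install kafka-python

# Hypothetical broker and topic; serialize events as JSON bytes.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {"order_id": 42, "amount": 99.5, "status": "COMPLETE"}
producer.send("orders", value=event)

# Block until buffered records are actually delivered.
producer.flush()
```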

Posted 3 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

Hyderabad, Telangana

On-site

Job Information
Date Opened: 05/23/2025
Industry: Information Technology
Job Type: Full time
Work Experience: 4-5 years
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500059

Job Description
KMC is seeking a motivated and adaptable NiFi/Astro/ETL Engineer with 3-4 years of experience in ETL workflows, data integration, and data pipeline management. The ideal candidate will thrive in an operational setting, collaborate well with team members, and demonstrate a readiness to learn and embrace new technologies. This role will focus on the development, maintenance, and support of ETL processes to ensure efficient data workflows and high-quality deliverables.

Roles and Responsibilities:
Design, implement, and maintain ETL workflows using Apache NiFi, Astro, and other relevant tools (see the NiFi monitoring sketch below).
Support data extraction, transformation, and loading (ETL) processes to ensure efficient data flow across systems.
Collaborate with data teams to ensure seamless integration of data from various sources, supporting data consistency and availability.
Configure and manage data ingestion processes from both structured and unstructured data sources.
Monitor ETL processes and data pipelines; troubleshoot and resolve issues in real time to ensure data accuracy and availability.
Provide on-call support as necessary to maintain smooth data operations.
Work closely with cross-functional teams to gather requirements, refine workflows, and ensure optimal data solutions.
Contribute actively to team discussions and solution planning, and provide input for continuous improvement.
Stay updated on industry trends and emerging technologies in data integration and ETL practices.
Show willingness to learn and adapt to new tools and methodologies as required by project or team needs.

Requirements
3-4 years of experience in ETL workflows, specifically with Apache NiFi and Astro (or similar platforms).
Proficiency in SQL and experience with data warehousing concepts.
Familiarity with scripting languages (e.g., Python, shell scripting) is a plus.
Basic understanding of cloud platforms (AWS, Azure, or Google Cloud).

Soft Skills:
Strong problem-solving abilities with an operational mindset.
Team player with effective communication skills to collaborate well within and across teams.
Quick learner, adaptable to new tools, and willing to take on challenges with a positive attitude.

Benefits
Insurance - Family Term Insurance
PF
Paid Time Off - 20 days
Holidays - 10 days
Flexi timing
Competitive Salary
Diverse & Inclusive workspace
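Monitoring NiFi flows, as this role requires, is often scripted against NiFi's REST API. The sketch below assumes an unsecured NiFi 1.x instance on localhost; secured clusters would first obtain a token from /nifi-api/access/token. Field names reflect the 1.x API and should be verified against your version.

```python
import requests

# Hypothetical unsecured NiFi instance; secured clusters need a bearer
# token from POST /nifi-api/access/token in an Authorization header.
NIFI = "http://localhost:8080"

resp = requests.get(f"{NIFI}/nifi-api/flow/status", timeout=10)
resp.raise_for_status()
status = resp.json()["controllerStatus"]

# Queued flowfile volume is a quick signal of a stuck or backed-up flow.
print("Active threads:", status["activeThreadCount"])
print("Queued:", status["queued"])
```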

Posted 3 weeks ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Skills: Data Engineer, Python, Spark, Cloudera, on-premise, Azure, Snowflake, Kafka

Overview Of The Company
Jio Platforms Ltd. is a revolutionary Indian multinational tech company, often referred to as India's biggest startup, headquartered in Mumbai. Launched in 2019, it's the powerhouse behind Jio, India's largest mobile network with over 400 million users. But Jio Platforms is more than just telecom. It's a comprehensive digital ecosystem, developing cutting-edge solutions across media, entertainment, and enterprise services through popular brands like JioMart, JioFiber, and JioSaavn. Join us at Jio Platforms and be part of a fast-paced, dynamic environment at the forefront of India's digital transformation. Collaborate with brilliant minds to develop next-gen solutions that empower millions and revolutionize industries.

Team Overview
The Data Platforms Team is the launchpad for a data-driven future, empowering the Reliance Group of Companies. We're a passionate group of experts architecting an enterprise-scale data mesh to unlock the power of big data, generative AI, and ML modelling across various domains. We don't just manage data; we transform it into intelligent actions that fuel strategic decision-making. Imagine crafting a platform that automates data flow, fuels intelligent insights, and empowers the organization: that's what we do. Join our collaborative and innovative team, and be a part of shaping the future of data for India's biggest digital revolution!

About The Role
Title: Lead Data Engineer
Location: Mumbai

Responsibilities
End-to-End Data Pipeline Development: Design, build, optimize, and maintain robust data pipelines across cloud, on-premises, or hybrid environments, ensuring performance, scalability, and seamless data flow.
Reusable Components & Frameworks: Develop reusable data pipeline components and contribute to the evolution of the team's data pipeline framework (see the sketch after this posting).
Data Architecture & Solutions: Contribute to data architecture design, applying data modelling, storage, and retrieval expertise.
Data Governance & Automation: Champion data integrity, security, and efficiency through metadata management, automation, and data governance best practices.
Collaborative Problem Solving: Partner with stakeholders, data teams, and engineers to define requirements, troubleshoot, optimize, and deliver data-driven insights.
Mentorship & Knowledge Transfer: Guide and mentor junior data engineers, fostering knowledge sharing and professional growth.

Qualification Details
Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field.
Core Programming: Excellent command of a primary data engineering language (Scala, Python, or Java) with a strong foundation in OOP and functional programming concepts.
Big Data Technologies: Hands-on experience with data processing frameworks (e.g., Hadoop, Spark, Apache Hive, NiFi, Ozone, Kudu), ideally including streaming technologies (Kafka, Spark Streaming, Flink, etc.).
Database Expertise: Excellent querying skills (SQL) and a strong understanding of relational databases (e.g., MySQL, PostgreSQL). Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
End-to-End Pipelines: Demonstrated experience in implementing, optimizing, and maintaining complete data pipelines, integrating varied sources and sinks, including streaming real-time data.
Cloud Expertise: Knowledge of cloud technologies such as Azure HDInsight, Synapse, and Event Hubs, and GCP Dataproc, Dataflow, and BigQuery.
CI/CD Expertise: Experience with CI/CD methodologies and tools, including strong Linux and shell scripting skills for automation.

Desired Skills & Attributes
Problem-Solving & Troubleshooting: Proven ability to analyze and solve complex data problems and troubleshoot data pipeline issues effectively.
Communication & Collaboration: Excellent communication skills, both written and verbal, with the ability to collaborate across teams (data scientists, engineers, stakeholders).
Continuous Learning & Adaptability: A demonstrated passion for staying up to date with emerging data technologies and a willingness to adapt to new tools.
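The responsibilities above emphasize reusable pipeline components. One common pattern, sketched below with hypothetical steps, is to express each transformation as a pure DataFrame-to-DataFrame function and chain them with PySpark's DataFrame.transform.

```python
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

# Each step is a pure DataFrame -> DataFrame function, so steps can be
# unit-tested alone and chained with DataFrame.transform().
def dedupe_latest(key: str, ts: str):
    def _step(df: DataFrame) -> DataFrame:
        w = Window.partitionBy(key).orderBy(F.col(ts).desc())
        return (df.withColumn("_rn", F.row_number().over(w))
                  .filter("_rn = 1")
                  .drop("_rn"))
    return _step

def add_ingest_date():
    def _step(df: DataFrame) -> DataFrame:
        return df.withColumn("ingest_date", F.current_date())
    return _step

spark = SparkSession.builder.appName("reusable-steps").getOrCreate()

# Invented sample data: two versions of the same record.
raw = spark.createDataFrame(
    [(1, "2024-01-01 10:00:00", "a"), (1, "2024-01-02 10:00:00", "b")],
    ["id", "updated_at", "value"],
)

clean = raw.transform(dedupe_latest("id", "updated_at")).transform(add_ingest_date())
clean.show()
```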

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Country: India
Location: Capital Cyberscape, 2nd Floor, Ullahwas, Sector 59, Gurugram, Haryana 122102
Role: Data Engineer
Location: Gurgaon
Full/Part-time: Full Time

Build a career with confidence.

Summary
Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do. This is an established Data Science & Analytics role: creating data mining architectures/models/protocols, statistical reporting, and data analysis methodologies to identify trends in large data sets.

About The Role
Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes
Help streamline our data science workflows, adding value to our product offerings and building out the customer lifecycle and retention models
Work closely with the data science and business intelligence teams to develop data models and pipelines for research, reporting, and machine learning
Be an advocate for best practices and continued learning

Key Responsibilities
Expert coding proficiency on Snowflake (see the connector sketch after this posting)
Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools like NiFi, SnapLogic, and dbt
Familiarity with building data pipelines that leverage the full power and best practices of Snowflake, and with integrating common technologies that work with Snowflake (code CI/CD, monitoring, orchestration, data quality)
Designing data ingestion and orchestration pipelines using NiFi, AWS, Kafka, Spark, and Control-M
Establish strategies for data extraction, ingestion, transformation, automation, and consumption

Role Responsibilities
Experience in data lake concepts with structured, semi-structured, and unstructured data
Experience in strategies for data testing, data quality, code quality, and code coverage
Hands-on expertise with Snowflake, preferably with SnowPro Core Certification
Develop a data model/architecture providing an integrated data architecture that enables business services with strict quality management and provides the basis for future knowledge management processes
Act as the interface between business and development teams to guide solutions end-to-end
Define tools used for design specifications, data modelling, and data management capabilities, with exploration into standard tools
Good understanding of data technologies, including RDBMS and NoSQL databases

Requirements
A minimum of 6 years of prior relevant experience
Strong exposure to data modelling, data access patterns, and SQL
Knowledge of data storage fundamentals and networking

Good to Have
Exposure to AWS tools/services
Ability to conduct testing at different levels and stages of the project
Knowledge of languages like Java and Python

Education
Bachelor's degree in computer systems, Information Technology, Analytics, or a related business area.

Benefits
We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
Have peace of mind and body with our health insurance
Drive your career forward through professional development opportunities
Achieve your personal goals with our Employee Assistance Programme

Our commitment to you
Our greatest assets are the expertise, creativity, and passion of our employees.
We strive to provide a great place to work that attracts, develops, and retains the best talent, promotes employee engagement, fosters teamwork, and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback, and always challenging ourselves to do better. This is The Carrier Way.

Join us and make a difference. Apply Now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.

Job Applicant's Privacy Notice
Click on this link to read the Job Applicant's Privacy Notice
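For the Snowflake-centric work described above, here is a minimal sketch using the snowflake-connector-python package. Account, credentials, and the stage/table names are hypothetical placeholders; COPY INTO is shown because it is the load primitive that Snowpipe automates for continuous ingestion.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical account and credentials; prefer key-pair auth or SSO in practice.
conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="ETL_USER",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # COPY INTO loads staged Parquet files by column name; Snowpipe runs
    # the same statement continuously as new files land on the stage.
    cur.execute("""
        COPY INTO staging.orders
        FROM @orders_stage
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    print(cur.fetchall())  # one result row per loaded file
finally:
    conn.close()
```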

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Country: India
Location: Capital Cyberscape, 2nd Floor, Ullahwas, Sector 59, Gurugram, Haryana 122102
Role: Data Engineer
Location: Gurgaon
Full/Part-time: Full Time

Build a career with confidence.

Summary
Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do. This is an established Data Science & Analytics role: creating data mining architectures/models/protocols, statistical reporting, and data analysis methodologies to identify trends in large data sets.

About The Role
Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes
Help streamline our data science workflows, adding value to our product offerings and building out the customer lifecycle and retention models
Work closely with the data science and business intelligence teams to develop data models and pipelines for research, reporting, and machine learning
Be an advocate for best practices and continued learning

Key Responsibilities
Expert coding proficiency on Snowflake
Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools like NiFi, SnapLogic, and dbt
Familiarity with building data pipelines that leverage the full power and best practices of Snowflake, and with integrating common technologies that work with Snowflake (code CI/CD, monitoring, orchestration, data quality)
Designing data ingestion and orchestration pipelines using NiFi, AWS, Kafka, Spark, and Control-M
Establish strategies for data extraction, ingestion, transformation, automation, and consumption

Role Responsibilities
Experience in data lake concepts with structured, semi-structured, and unstructured data
Experience in strategies for data testing, data quality, code quality, and code coverage
Hands-on expertise with Snowflake, preferably with SnowPro Core Certification
Develop a data model/architecture providing an integrated data architecture that enables business services with strict quality management and provides the basis for future knowledge management processes
Act as the interface between business and development teams to guide solutions end-to-end
Define tools used for design specifications, data modelling, and data management capabilities, with exploration into standard tools
Good understanding of data technologies, including RDBMS and NoSQL databases

Requirements
A minimum of 3 years of prior relevant experience
Strong exposure to data modelling, data access patterns, and SQL
Knowledge of data storage fundamentals and networking

Good to Have
Exposure to AWS tools/services
Ability to conduct testing at different levels and stages of the project
Knowledge of languages like Java and Python

Education
Bachelor's degree in computer systems, Information Technology, Analytics, or a related business area.

Benefits
We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
Have peace of mind and body with our health insurance
Drive your career forward through professional development opportunities
Achieve your personal goals with our Employee Assistance Programme

Our commitment to you
Our greatest assets are the expertise, creativity, and passion of our employees.
We strive to provide a great place to work that attracts, develops, and retains the best talent, promotes employee engagement, fosters teamwork, and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback, and always challenging ourselves to do better. This is The Carrier Way.

Join us and make a difference. Apply Now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.

Job Applicant's Privacy Notice
Click on this link to read the Job Applicant's Privacy Notice

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About GSPANN
GSPANN is a global IT services and consultancy provider headquartered in Milpitas, California (U.S.A.). With five global delivery centers across the globe, GSPANN provides digital solutions that support the customer buying journeys of B2B and B2C brands worldwide. With a strong focus on innovation and client satisfaction, GSPANN delivers cutting-edge solutions that drive business success and operational excellence. GSPANN helps retail, finance, manufacturing, and high-technology brands deliver competitive customer experiences and increased revenues through our solution delivery, technologies, practices, and operations for each client. For more information, visit www.gspann.com

We are looking for a passionate Data Modeler to build, optimize, and maintain conceptual, logical, and physical database models. The candidate will turn data into information, information into insight, and insight into business decisions.

Job Position: Data Modeler
Experience: 5+ years
Location: Hyderabad, Gurugram
Skills: Data Modeling, Data Analysis, Cloud, and SQL

Responsibilities:
Design and develop conceptual, logical, and physical data models for databases, data warehouses, and data lakes (a small schema sketch follows this posting).
Translate business requirements into data structures that fit both OLTP (Online Transaction Processing) and OLAP (Online Analytical Processing) environments.

Requirements:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
3+ years of experience as a Data Modeler or in a related role.
Proficiency in data modeling tools (Erwin, ER/Studio, SQL Developer Data Modeler).
Strong experience with SQL and database technologies (Oracle, SQL Server, MySQL, PostgreSQL).
Familiarity with ETL tools (Informatica, Talend, Apache NiFi) and data integration techniques.
Knowledge of data warehousing concepts and data lake architecture.
Understanding of big data technologies (Hadoop, Spark) is a plus.
Experience with cloud platforms like AWS, GCP, or Azure.

Why Choose GSPANN?
At GSPANN, we don't just serve our clients; we co-create. GSPANNians are passionate technologists who thrive on solving the toughest business challenges, delivering trailblazing innovations for marquee clients. This collaborative spirit fuels a culture where every individual is encouraged to sharpen their skills, feed their curiosity, and take ownership to learn, experiment, and succeed. We believe in celebrating each other's successes, big or small, and giving back to the communities we call home. If you're ready to push boundaries and be part of a close-knit team that's shaping the future of tech, we invite you to carry forward the baton of innovation with us. Let's Co-Create the Future, Together.

Discover Your Inner Technologist: Explore and expand the boundaries of tech innovation without the fear of failure.
Accelerate Your Learning: Shape your career while scripting the future of tech. Seize the ample learning opportunities to grow at a rapid pace.
Feel Included: At GSPANN, everyone is welcome. Age, gender, culture, and nationality do not matter here; what matters is YOU.
Inspire and Be Inspired: When you work with the experts, you raise your game. At GSPANN, you're in the company of marquee clients and extremely talented colleagues.
Enjoy Life: We love to celebrate milestones and victories, big or small. Ever so often, we come together as one large GSPANN family.
Give Back: Together, we serve communities.
We take steps, small and large, so we can do good for the environment, weaving sustainability and social change into our endeavors. We invite you to carry forward the baton of innovation in technology with us. Let's Co-Create.

GSPANN | Consulting Services, Technology Services, and IT Services Provider
GSPANN provides consulting services, technology services, and IT services to e-commerce businesses in high technology, manufacturing, and financial services.
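As a toy illustration of the dimensional modeling this role centers on, the sketch below creates a small star schema (one fact table, two dimensions) in an in-memory SQLite database. The schema is invented for illustration; production models would live in Oracle, SQL Server, or a warehouse.

```python
import sqlite3

# A toy star schema: one fact table keyed to two dimensions.
ddl = """
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,  -- e.g. 20250101
    full_date TEXT NOT NULL,
    month     INTEGER NOT NULL,
    year      INTEGER NOT NULL
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    sku         TEXT NOT NULL UNIQUE,
    category    TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
    quantity    INTEGER NOT NULL,
    amount      REAL NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
print("Tables:", [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")])
```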

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Advisory - Data and Analytics – Staff – Data Engineer (Scala)

EY's Advisory Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice provides integrated advisory services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 companies. Within EY's Advisory Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

The opportunity
We're looking for Senior Big Data experts with expertise in the Financial Services domain and hands-on experience with the big data ecosystem.

Primary Skills And Key Responsibilities
Strong knowledge of Spark: a good understanding of the Spark framework and performance tuning (a small tuning sketch follows this posting)
Proficiency in Scala & SQL
Good exposure to one of the cloud technologies: GCP/Azure/AWS
Hands-on experience in designing, building, and maintaining scalable data pipelines and solutions to manage and process large datasets efficiently
Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
Excellent communicator (written and verbal, formal and informal)
Ability to multi-task under pressure and work independently with minimal supervision
Strong verbal and written communication skills
Must be a team player and enjoy working in a cooperative and collaborative team environment
Adaptable to new technologies and standards
Participate in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support

Nice To Have Skills
Design, develop, and deploy robust and scalable data pipelines using GCP services such as BigQuery, Dataflow, Cloud Composer (Airflow), and related technologies
Good experience in GCP technology areas of Datastore, BigQuery, Cloud Storage, Persistent Disk, IAM, roles, projects, and organizations
Understanding of and familiarity with all Hadoop ecosystem components and Hadoop administration fundamentals
Experience working with NoSQL in at least one of the data stores: HBase, Cassandra, MongoDB
Experience in HDFS, Hive, Impala
Experience in schedulers like Airflow, NiFi, etc.
Experienced in Hadoop clustering and auto-scaling
Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis
Define and develop client-specific best practices around data management within a Hadoop environment on Azure cloud

To qualify for the role, you must have
BE/BTech/MCA/MBA
A minimum of 2 years of hands-on experience in one or more relevant areas
A total of 1-3 years of industry experience

Ideally, you'll also have
Experience in the Banking and Capital Markets domains

Skills And Attributes For Success
Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates
Strong communication, presentation, and team-building skills, and experience in producing high-quality reports, papers, and presentations
Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint

What We Look For
A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment
An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide
Opportunities to work with EY Advisory practices globally, with leading businesses across a range of industries

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching, and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
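Spark performance tuning, called out in the primary skills above, often starts with join strategy. The sketch below (PySpark, with invented data) shows the broadcast-join hint that avoids shuffling the large side of a join; the same hint exists in Scala's Dataset API.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("join-tuning").getOrCreate()

# Invented data: a large fact-like frame and a tiny dimension.
trades = spark.range(10_000_000).withColumn("ccy_id", F.col("id") % 5)
currencies = spark.createDataFrame(
    [(0, "USD"), (1, "EUR"), (2, "GBP"), (3, "JPY"), (4, "INR")],
    ["ccy_id", "ccy"],
)

# Broadcasting the small side ships it to every executor and avoids
# shuffling the 10M-row side: a common first tuning lever.
joined = trades.join(F.broadcast(currencies), "ccy_id")
joined.groupBy("ccy").count().show()
```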

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Advisory - Data and Analytics – Staff – Data Engineer (Scala)

EY's Advisory Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice provides integrated advisory services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 companies. Within EY's Advisory Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

The opportunity
We're looking for Senior Big Data experts with expertise in the Financial Services domain and hands-on experience with the big data ecosystem.

Primary Skills And Key Responsibilities
Strong knowledge of Spark: a good understanding of the Spark framework and performance tuning
Proficiency in Scala & SQL
Good exposure to one of the cloud technologies: GCP/Azure/AWS
Hands-on experience in designing, building, and maintaining scalable data pipelines and solutions to manage and process large datasets efficiently
Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
Excellent communicator (written and verbal, formal and informal)
Ability to multi-task under pressure and work independently with minimal supervision
Strong verbal and written communication skills
Must be a team player and enjoy working in a cooperative and collaborative team environment
Adaptable to new technologies and standards
Participate in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support

Nice To Have Skills
Design, develop, and deploy robust and scalable data pipelines using GCP services such as BigQuery, Dataflow, Cloud Composer (Airflow), and related technologies
Good experience in GCP technology areas of Datastore, BigQuery, Cloud Storage, Persistent Disk, IAM, roles, projects, and organizations
Understanding of and familiarity with all Hadoop ecosystem components and Hadoop administration fundamentals
Experience working with NoSQL in at least one of the data stores: HBase, Cassandra, MongoDB
Experience in HDFS, Hive, Impala
Experience in schedulers like Airflow, NiFi, etc.
Experienced in Hadoop clustering and auto-scaling
Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis
Define and develop client-specific best practices around data management within a Hadoop environment on Azure cloud

To qualify for the role, you must have
BE/BTech/MCA/MBA
A minimum of 2 years of hands-on experience in one or more relevant areas
A total of 1-3 years of industry experience

Ideally, you'll also have
Experience in the Banking and Capital Markets domains

Skills And Attributes For Success
Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates
Strong communication, presentation, and team-building skills, and experience in producing high-quality reports, papers, and presentations
Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint

What We Look For
A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment
An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide
Opportunities to work with EY Advisory practices globally, with leading businesses across a range of industries

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching, and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Advisory - Data and Analytics – Staff – Data Engineer (Scala)

EY's Advisory Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice provides integrated advisory services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 companies. Within EY's Advisory Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

The opportunity
We're looking for Senior Big Data experts with expertise in the Financial Services domain and hands-on experience with the big data ecosystem.

Primary Skills And Key Responsibilities
Strong knowledge of Spark: a good understanding of the Spark framework and performance tuning
Proficiency in Scala & SQL
Good exposure to one of the cloud technologies: GCP/Azure/AWS
Hands-on experience in designing, building, and maintaining scalable data pipelines and solutions to manage and process large datasets efficiently
Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
Excellent communicator (written and verbal, formal and informal)
Ability to multi-task under pressure and work independently with minimal supervision
Strong verbal and written communication skills
Must be a team player and enjoy working in a cooperative and collaborative team environment
Adaptable to new technologies and standards
Participate in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support

Nice To Have Skills
Design, develop, and deploy robust and scalable data pipelines using GCP services such as BigQuery, Dataflow, Cloud Composer (Airflow), and related technologies
Good experience in GCP technology areas of Datastore, BigQuery, Cloud Storage, Persistent Disk, IAM, roles, projects, and organizations
Understanding of and familiarity with all Hadoop ecosystem components and Hadoop administration fundamentals
Experience working with NoSQL in at least one of the data stores: HBase, Cassandra, MongoDB
Experience in HDFS, Hive, Impala
Experience in schedulers like Airflow, NiFi, etc.
Experienced in Hadoop clustering and auto-scaling
Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis
Define and develop client-specific best practices around data management within a Hadoop environment on Azure cloud

To qualify for the role, you must have
BE/BTech/MCA/MBA
A minimum of 2 years of hands-on experience in one or more relevant areas
A total of 1-3 years of industry experience

Ideally, you'll also have
Experience in the Banking and Capital Markets domains

Skills And Attributes For Success
Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates
Strong communication, presentation, and team-building skills, and experience in producing high-quality reports, papers, and presentations
Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint

What We Look For
A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment
An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide
Opportunities to work with EY Advisory practices globally, with leading businesses across a range of industries

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching, and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply