
601 Sqoop Jobs - Page 4

JobPe aggregates listings for easy access to applications; you apply directly on the original job portal.

5.0 - 10.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring successful project delivery.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the application development process
- Ensure successful project delivery
- Mentor and guide team members

Professional & Technical Skills:
- Must-have skills: proficiency in PySpark
- Strong understanding of big data processing
- Experience with data pipelines and ETL processes
- Hands-on experience in building scalable applications
- Knowledge of cloud platforms and services

Additional Information:
- The candidate should have a minimum of 5 years of experience in PySpark
- This position is based at our Hyderabad office
- 15 years of full-time education is required
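Listings like this one revolve around building ETL pipelines in PySpark. As a hedged, minimal sketch of the kind of batch transformation such roles describe (the paths, columns, and logic below are invented for illustration, not taken from any employer):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal PySpark ETL sketch: read raw CSV, cleanse, aggregate, write Parquet.
# All paths and column names are hypothetical.
spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

orders = (
    spark.read.option("header", True).option("inferSchema", True)
    .csv("/data/raw/orders.csv")
)

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")           # keep only completed orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"),
         F.countDistinct("customer_id").alias("customers"))
)

daily_revenue.write.mode("overwrite").parquet("/data/curated/daily_revenue")
```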

Posted 4 days ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Pune

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: An Engineering graduate, preferably in Computer Science, with 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will oversee the application development process and ensure successful project delivery.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the application development process
- Ensure timely project delivery
- Provide technical guidance and support to the team

Professional & Technical Skills:
- Must-have skills: proficiency in PySpark
- Strong understanding of big data processing
- Experience with cloud platforms like AWS or Azure
- Hands-on experience in designing and implementing scalable applications
- Knowledge of data modeling and database management

Additional Information:
- The candidate should have a minimum of 5 years of experience in PySpark
- This position is based at our Pune office
- An Engineering graduate, preferably in Computer Science, with 15 years of full-time education is required

Posted 4 days ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


The Digital Solutions and Innovation (DSI) team within the Citi Internal Audit Innovation function is looking for a Business Analytics Analyst (Officer) to join the Internal Audit Analytics Team. The Analytics Team works with members of Internal Audit to identify opportunities and to design, develop and implement analytics in support of the performance of audit activities, along with automation activities to promote efficiencies and expand coverage. The candidate must be proficient in the development and use of analytics technology and tools to provide analytical insight and automated solutions that enhance audit efficiency and effectiveness, and must have functional knowledge of banking processes and related risks and controls.

Key Responsibilities:
- Participating in the innovative use of audit analytics through direct participation in all phases of audits.
- Supporting the definition of data needs, and designing and executing audit analytics during audits in accordance with the audit methodology and professional standards.
- Supporting execution of automated routines to help focus audit testing.
- Executing innovation solutions and pre-defined analytics in accordance with standard A&A procedures.
- Assisting audit teams in performing moderately complex audits related to a specific area of the bank: Consumer Banking, Investment Banking, Risk, Finance, Compliance, and/or Technology.
- Providing support to other members of the Analytics and Automation team and the wider Digital Solutions and Innovation team.
- Strong verbal and written communication skills to clearly articulate analytics requirements and results.
- Developing professional relationships with audit teams to assist in the definition of analytics and automation opportunities.
- Developing effective working relationships with technology and business teams of the area being audited, to facilitate understanding of processes and sourcing of data.
- Promoting continuous improvement in all aspects of audit automation activities (e.g., technical environment, software, operating procedures).

Key Qualifications and Competencies:
- At least 3 years of business/audit analyst experience in providing analytical techniques and automated solutions to business needs.
- Work experience in a global environment and in a large company.
- Excellent technical, programming and database skills.
- Excellent analytical ability to understand business processes and related risks and controls, and to develop innovative audit analytics based upon audit needs.
- Strong interpersonal and multicultural skills for interfacing with all levels of internal and external audit and management.
- Self-driven, problem-solving approach.
- Understanding of procedures, and following them to maintain the quality and security of processes.
- Detail-oriented approach, consistently performing diligent self-reviews of work product, with attention to data completeness and accuracy.
- Data literate, with the ability to understand and effectively communicate what data means to technical and non-technical stakeholders.
- Proficiency in one or more of the following technical skills is required: SQL, Python, Hadoop ecosystem (Hive, Sqoop, PySpark, etc.), Alteryx.
- Proficiency in at least one of the following data visualization tools is a plus: Tableau, MicroStrategy, Cognos.
- Experience in the following areas would be a plus: business intelligence, including use of statistics, data modelling, data mining and predictive analytics; application of data science tools and techniques to advance the insights obtained through the interrogation of data; working with non-structured data such as PDF files; banking businesses (i.e., Institutional Clients Group, Consumer, Corporate Functions) or areas of expertise (i.e., Anti-Money Laundering, Regulatory Reporting); big data analysis, including dedicated big data tools like HUE and Hive; project management / solution development life cycle; exposure to process mining software such as Celonis.

What we offer:
- A chance to develop in a highly innovative environment where you can use the newest technologies in a top-quality organizational culture.
- Professional development in a truly global environment.
- An inclusive and friendly corporate culture where gender diversity and equality is widely recognized.
- A supportive workplace for professionals returning to the office from childcare leave.
- An enjoyable and challenging learning path, which leads to a deep understanding of Citi’s products and services.
- Yearly discretionary bonus and competitive social benefits (private medical care, multisport, life insurance, award-winning pension scheme, holiday allowance, flexible working schedule and others).

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Job Family Group: Decision Management
Job Family: Business Analysis
Time Type:

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
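This listing, like many on this page, names the Hadoop ecosystem (Hive, Sqoop, PySpark). As a hedged illustration of the ingestion pattern Sqoop is typically used for, the sketch below performs a partitioned relational pull with PySpark's JDBC reader; every connection detail is invented, and a matching JDBC driver jar is assumed to be on the classpath.

```python
from pyspark.sql import SparkSession

# Sketch of relational-to-data-lake ingestion, the pattern Sqoop automates.
# PySpark's JDBC reader does a partitioned (parallel) pull, analogous to
# `sqoop import --split-by id -m 8`. All connection details are made up.
spark = SparkSession.builder.appName("jdbc-ingest-sketch").getOrCreate()

transactions = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//db.example.com:1521/AUDITDB")
    .option("dbtable", "transactions")
    .option("user", "reader").option("password", "***")
    .option("partitionColumn", "id")   # like Sqoop's --split-by
    .option("lowerBound", 1)
    .option("upperBound", 10_000_000)
    .option("numPartitions", 8)        # like Sqoop's -m 8
    .load()
)

transactions.write.mode("overwrite").saveAsTable("audit_stage.transactions")
```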

Posted 4 days ago

Apply

10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain, and Finance.

The opportunity

We’re looking for Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in both delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Proven experience driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. [10-15 years]
- Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
- Understand current and future-state enterprise architecture.
- Contribute to various technical streams during project implementation.
- Provide product- and design-level technical best practices.
- Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions.
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
- Recommend design alternatives for data ingestion, processing and provisioning layers.
- Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark.
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies. A hedged sketch of this pattern follows the Skills list below.

Skills and Attributes for Success
- Experience architecting highly scalable solutions on Azure, AWS and GCP.
- Strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components.
- Strong understanding of underlying Azure/AWS/GCP and Hadoop architectural concepts and distributed computing paradigms.
- Hands-on programming experience in Apache Spark using Python/Scala, and Spark Streaming.
- Hands-on experience with major components like cloud ETLs, Spark and Databricks.
- Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB.
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
- Good knowledge of Apache Kafka and Apache Flume.
- Experience with enterprise-grade solution implementations.
- Experience in performance benchmarking of enterprise applications.
- Experience in data security [in motion, at rest].
- Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
- A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills (written and verbal, formal and informal).
- The ability to multi-task under pressure and work independently with minimal supervision.
- A team-player attitude and enjoyment of a cooperative, collaborative team environment.
- Adaptability to new technologies and standards.
- Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment and support.
- Responsibility for evaluating technical risks and mapping out mitigation strategies.
- Working knowledge of at least one cloud platform: AWS, Azure or GCP.
- Excellent business communication, consulting and quality process skills.
- Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
- A minimum of 7 years of hands-on experience in one or more of the above areas.
- A minimum of 10 years of industry experience.

Ideally, you’ll also have
- Strong project management skills
- Client management skills
- Solutioning skills

What We Look For

People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers

At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
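The responsibilities above include real-time ingestion with Apache Kafka and Spark Streaming. Here is a minimal sketch of that pattern, assuming a hypothetical broker, topic, and landing path, plus the spark-sql-kafka connector package on the classpath:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal Spark Structured Streaming sketch: consume a Kafka topic and land
# it as Parquet. Broker, topic, and paths are hypothetical; requires the
# spark-sql-kafka connector package on the Spark classpath.
spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")
    .option("subscribe", "trade-events")
    .option("startingOffsets", "latest")
    .load()
    .select(
        F.col("key").cast("string"),
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp"),
    )
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/stream/trade_events")
    .option("checkpointLocation", "/chk/trade_events")  # enables restart recovery
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```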

Posted 4 days ago

Apply

10.0 years

0 Lacs

Kanayannur, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain, and Finance.

The opportunity

We’re looking for Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in both delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Proven experience driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. [10-15 years]
- Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
- Understand current and future-state enterprise architecture.
- Contribute to various technical streams during project implementation.
- Provide product- and design-level technical best practices.
- Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions.
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
- Recommend design alternatives for data ingestion, processing and provisioning layers.
- Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark.
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.

Skills and Attributes for Success
- Experience architecting highly scalable solutions on Azure, AWS and GCP.
- Strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components.
- Strong understanding of underlying Azure/AWS/GCP and Hadoop architectural concepts and distributed computing paradigms.
- Hands-on programming experience in Apache Spark using Python/Scala, and Spark Streaming.
- Hands-on experience with major components like cloud ETLs, Spark and Databricks.
- Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB.
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
- Good knowledge of Apache Kafka and Apache Flume.
- Experience with enterprise-grade solution implementations.
- Experience in performance benchmarking of enterprise applications.
- Experience in data security [in motion, at rest].
- Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
- A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills (written and verbal, formal and informal).
- The ability to multi-task under pressure and work independently with minimal supervision.
- A team-player attitude and enjoyment of a cooperative, collaborative team environment.
- Adaptability to new technologies and standards.
- Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment and support.
- Responsibility for evaluating technical risks and mapping out mitigation strategies.
- Working knowledge of at least one cloud platform: AWS, Azure or GCP.
- Excellent business communication, consulting and quality process skills.
- Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
- A minimum of 7 years of hands-on experience in one or more of the above areas.
- A minimum of 10 years of industry experience.

Ideally, you’ll also have
- Strong project management skills
- Client management skills
- Solutioning skills

What We Look For

People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers

At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 days ago

Apply

10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain, and Finance.

The opportunity

We’re looking for Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in both delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Proven experience driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. [10-15 years]
- Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
- Understand current and future-state enterprise architecture.
- Contribute to various technical streams during project implementation.
- Provide product- and design-level technical best practices.
- Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions.
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
- Recommend design alternatives for data ingestion, processing and provisioning layers.
- Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark.
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.

Skills and Attributes for Success
- Experience architecting highly scalable solutions on Azure, AWS and GCP.
- Strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components.
- Strong understanding of underlying Azure/AWS/GCP and Hadoop architectural concepts and distributed computing paradigms.
- Hands-on programming experience in Apache Spark using Python/Scala, and Spark Streaming.
- Hands-on experience with major components like cloud ETLs, Spark and Databricks.
- Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB.
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
- Good knowledge of Apache Kafka and Apache Flume.
- Experience with enterprise-grade solution implementations.
- Experience in performance benchmarking of enterprise applications.
- Experience in data security [in motion, at rest].
- Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
- A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills (written and verbal, formal and informal).
- The ability to multi-task under pressure and work independently with minimal supervision.
- A team-player attitude and enjoyment of a cooperative, collaborative team environment.
- Adaptability to new technologies and standards.
- Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment and support.
- Responsibility for evaluating technical risks and mapping out mitigation strategies.
- Working knowledge of at least one cloud platform: AWS, Azure or GCP.
- Excellent business communication, consulting and quality process skills.
- Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
- A minimum of 7 years of hands-on experience in one or more of the above areas.
- A minimum of 10 years of industry experience.

Ideally, you’ll also have
- Strong project management skills
- Client management skills
- Solutioning skills

What We Look For

People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers

At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 days ago

Apply

8.0 years

6 - 8 Lacs

Chennai

On-site

Responsibilities:
- Develop, test, and deploy data processing applications using Apache Spark and Scala.
- Optimize and tune Spark applications for better performance on large-scale data sets.
- Work with the Cloudera Hadoop ecosystem (e.g., HDFS, Hive, Impala, HBase, Kafka) to build data pipelines and storage solutions.
- Collaborate with data scientists, business analysts, and other developers to understand data requirements and deliver solutions.
- Design and implement high-performance data processing and analytics solutions.
- Ensure data integrity, accuracy, and security across all processing tasks.
- Troubleshoot and resolve performance issues in Spark, Cloudera, and related technologies.
- Implement version control and CI/CD pipelines for Spark applications.

Required Skills & Experience:
- Minimum 8 years of experience in application development.
- Strong hands-on experience in Apache Spark, Scala, and Spark SQL for distributed data processing.
- Hands-on experience with Cloudera Hadoop (CDH) components such as HDFS, Hive, Impala, HBase, Kafka, and Sqoop.
- Familiarity with other Big Data technologies, including Apache Kafka, Flume, Oozie, and NiFi.
- Experience building and optimizing ETL pipelines using Spark and working with structured and unstructured data.
- Experience with SQL and NoSQL databases such as HBase, Hive, and PostgreSQL.
- Knowledge of data warehousing concepts, dimensional modeling, and data lakes.
- Ability to troubleshoot and optimize Spark and Cloudera platform performance.
- Familiarity with version control tools like Git and CI/CD tools (e.g., Jenkins, GitLab).
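This role emphasizes optimizing and tuning Spark applications. The posting asks for Scala, but the same tuning knobs are compactly shown in PySpark below; the table names, settings, and thresholds are illustrative assumptions, not recommendations for any specific workload.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.functions import broadcast

# Illustrative Spark tuning knobs (PySpark shown; the same API exists in
# Scala). Table names are hypothetical.
spark = (
    SparkSession.builder.appName("tuning-sketch")
    .config("spark.sql.shuffle.partitions", "400")   # size shuffles to the data
    .config("spark.sql.adaptive.enabled", "true")    # let AQE coalesce partitions
    .getOrCreate()
)

facts = spark.table("warehouse.clickstream")   # large fact table
dims = spark.table("warehouse.page_dim")       # small dimension table

# Broadcast the small side to avoid shuffling the big table during the join.
joined = facts.join(broadcast(dims), "page_id")

# Cache only what is reused across several downstream aggregations.
joined.cache()
summary = joined.groupBy("page_category").agg(F.count("*").alias("hits"))
summary.show()
```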

Posted 5 days ago

Apply

12.0 - 15.0 years

35 - 50 Lacs

Hyderabad

Work from Office


Skill: Java, Spark, Kafka
Experience: 10 to 16 years
Location: Hyderabad

As a Data Engineer, you will:
- Support the design and roll-out of the data architecture and infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
- Identify data sources, design and implement data schemas/models, and integrate data to meet the requirements of business stakeholders.
- Play an active role in the end-to-end delivery of AI solutions, from ideation and feasibility assessment to data preparation and industrialization.
- Work with business, IT and data stakeholders to support data-related technical issues and data infrastructure needs, and to build the most flexible and scalable data platform.
- With a strong focus on DataOps, design, develop and deploy scalable batch and/or real-time data pipelines.
- Design, document, test and deploy ETL/ELT processes.
- Find the right trade-offs between the performance, reliability, scalability, and cost of the data pipelines you implement.
- Monitor data processing efficiency and propose solutions for improvements.
- Create and maintain comprehensive project documentation.
- Build and share knowledge with colleagues and coach junior profiles.

Posted 5 days ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Position: Senior Data Engineer - Airflow, PL/SQL
Experience: 5+ years
Location: Bangalore/Hyderabad/Pune

Seeking a Senior Data Engineer with strong expertise in Apache Airflow and Oracle PL/SQL, along with working experience in Snowflake and Agile methodologies. The ideal candidate will also take up Scrum Master responsibilities and lead a data engineering scrum team to deliver robust, scalable data solutions.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Apache Airflow.
- Write and optimize complex PL/SQL queries, procedures, and packages on Oracle databases.
- Collaborate with cross-functional teams to design efficient data models and integration workflows.
- Work with Snowflake for data warehousing and analytics use cases.
- Own the delivery of sprint goals, backlog grooming, and facilitation of agile ceremonies as the Scrum Master.
- Monitor pipeline health and troubleshoot production data issues proactively.
- Ensure code quality, documentation, and best practices across the team.
- Mentor junior data engineers and promote a culture of continuous improvement.

Required Skills and Qualifications:
- 5+ years of experience as a Data Engineer in enterprise environments.
- Strong expertise in Apache Airflow for orchestrating workflows.
- Expert in Oracle PL/SQL - stored procedures, performance tuning, debugging.
- Hands-on experience with Snowflake - data modeling, SQL, optimization.
- Working knowledge of version control (Git) and CI/CD practices.
- Prior experience or certification as a Scrum Master is highly desirable.
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication and leadership skills.
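As a rough sketch of the Airflow orchestration work this role describes, here is a two-task daily DAG with retries; the DAG id, schedule, and task bodies are placeholders, not any employer's actual pipeline.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

# Minimal Airflow 2.x DAG sketch: a daily extract -> transform chain.
# DAG id, schedule, and task bodies are hypothetical placeholders.

def extract(**context):
    print("pulling daily snapshot")   # placeholder for an Oracle/Snowflake pull

def transform(**context):
    print("applying business rules")  # placeholder for PL/SQL or PySpark logic

with DAG(
    dag_id="daily_snapshot_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # transform runs only after extract succeeds
```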

Posted 5 days ago

Apply

2.0 - 4.0 years

8 - 12 Lacs

Mumbai

Work from Office


The SAS to Databricks Migration Developer will be responsible for migrating existing SAS code, data processes, and workflows to the Databricks platform. This role requires expertise in both SAS and Databricks, with a focus on converting SAS logic into scalable PySpark and Python code. The developer will design, implement, and optimize data pipelines, ensuring seamless integration and functionality within the Databricks environment. Collaboration with various teams is essential to understand data requirements and deliver solutions that meet business needs.
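To make the migration concrete, here is a hedged before/after sketch: a typical SAS DATA step (shown in a comment) and one possible PySpark translation. The dataset and column names are invented, and real conversions must also handle SAS-specific semantics such as implicit row iteration and missing values.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical SAS DATA step being migrated:
#
#   data work.high_value;
#       set raw.claims;
#       where amount > 10000;
#       payout_ratio = paid / amount;
#   run;
#
# One possible PySpark translation (all names invented for illustration):
spark = SparkSession.builder.appName("sas-migration-sketch").getOrCreate()

claims = spark.table("raw.claims")

high_value = (
    claims
    .filter(F.col("amount") > 10000)                              # WHERE clause
    .withColumn("payout_ratio", F.col("paid") / F.col("amount"))  # derived column
)

high_value.write.mode("overwrite").saveAsTable("work.high_value")
```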

Posted 5 days ago

Apply

8.0 - 12.0 years

7 - 11 Lacs

Bengaluru

Work from Office


Allime Tech Solutions is looking for a Hadoop with Scala Developer to join our dynamic team and embark on a rewarding career journey. A Developer is responsible for designing, developing, and maintaining software applications and systems. They collaborate with a team of software developers, designers, and stakeholders to create software solutions that meet the needs of the business.

Key responsibilities:
- Design, code, test, and debug software applications and systems
- Collaborate with cross-functional teams to identify and resolve software issues
- Write clean, efficient, and well-documented code
- Stay current with emerging technologies and industry trends
- Participate in code reviews to ensure code quality and adherence to coding standards
- Participate in the full software development life cycle, from requirement gathering to deployment
- Provide technical support and troubleshooting for production issues

Requirements:
- Strong programming skills in one or more programming languages, such as Python, Java, C++, or JavaScript
- Experience with software development tools, such as version control systems (e.g., Git), integrated development environments (IDEs), and debugging tools
- Familiarity with software design patterns and best practices
- Good communication and collaboration skills

Posted 5 days ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Chennai, Bengaluru

Work from Office


Job requisition ID: JR1027452

Overall Responsibilities:
- Data Pipeline Development: Design, develop, and maintain highly scalable and optimized ETL pipelines using PySpark on the Cloudera Data Platform, ensuring data integrity and accuracy.
- Data Ingestion: Implement and manage data ingestion processes from a variety of sources (e.g., relational databases, APIs, file systems) to the data lake or data warehouse on CDP.
- Data Transformation and Processing: Use PySpark to process, cleanse, and transform large datasets into meaningful formats that support analytical needs and business requirements.
- Performance Optimization: Conduct performance tuning of PySpark code and Cloudera components, optimizing resource utilization and reducing runtime of ETL processes.
- Data Quality and Validation: Implement data quality checks, monitoring, and validation routines to ensure data accuracy and reliability throughout the pipeline (a sketch follows this listing).
- Automation and Orchestration: Automate data workflows using tools like Apache Oozie, Airflow, or similar orchestration tools within the Cloudera ecosystem.
- Monitoring and Maintenance: Monitor pipeline performance, troubleshoot issues, and perform routine maintenance on the Cloudera Data Platform and associated data processes.
- Collaboration: Work closely with other data engineers, analysts, product managers, and other stakeholders to understand data requirements and support various data-driven initiatives.
- Documentation: Maintain thorough documentation of data engineering processes, code, and pipeline configurations.

Technical Skills:
- PySpark: Advanced proficiency in PySpark, including working with RDDs, DataFrames, and optimization techniques.
- Cloudera Data Platform: Strong experience with Cloudera Data Platform (CDP) components, including Cloudera Manager, Hive, Impala, HDFS, and HBase.
- Data Warehousing: Knowledge of data warehousing concepts, ETL best practices, and experience with SQL-based tools (e.g., Hive, Impala).
- Big Data Technologies: Familiarity with Hadoop, Kafka, and other distributed computing tools.
- Orchestration and Scheduling: Experience with Apache Oozie, Airflow, or similar orchestration frameworks.
- Scripting and Automation: Strong scripting skills in Linux.

Experience:
- 5-12 years of experience as a Data Engineer, with a strong focus on PySpark and the Cloudera Data Platform.
- Proven track record of implementing data engineering best practices.
- Experience in data ingestion, transformation, and optimization on the Cloudera Data Platform.

Day-to-Day Activities:
- Design, develop, and maintain ETL pipelines using PySpark on CDP.
- Implement and manage data ingestion processes from various sources.
- Process, cleanse, and transform large datasets using PySpark.
- Conduct performance tuning and optimization of ETL processes.
- Implement data quality checks and validation routines.
- Automate data workflows using orchestration tools.
- Monitor pipeline performance and troubleshoot issues.
- Collaborate with team members to understand data requirements.
- Maintain documentation of data engineering processes and configurations.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Relevant certifications in PySpark and Cloudera technologies are a plus.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent verbal and written communication abilities.
- Ability to work independently and collaboratively in a team environment.
- Attention to detail and commitment to data quality.
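The listing calls for data quality checks and validation routines inside the pipeline. Here is one minimal shape such a gate can take, with hypothetical table names, keys, and thresholds:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Sketch of a pipeline data-quality gate: row count, null-rate, and duplicate
# checks before publishing. Table names, keys, and thresholds are invented.
spark = SparkSession.builder.appName("dq-sketch").getOrCreate()

df = spark.table("staging.customer_orders")

total = df.count()
null_ids = df.filter(F.col("customer_id").isNull()).count()
dupes = total - df.dropDuplicates(["order_id"]).count()

checks = {
    "non_empty": total > 0,
    "null_rate_ok": total > 0 and null_ids / total < 0.01,  # <1% null keys
    "no_duplicate_orders": dupes == 0,
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    # Fail fast so the orchestrator (Oozie/Airflow) marks the run unhealthy.
    raise ValueError(f"Data quality checks failed: {failed}")

df.write.mode("overwrite").saveAsTable("curated.customer_orders")
```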

Posted 5 days ago

Apply

5.0 - 10.0 years

13 - 18 Lacs

Gurugram

Work from Office


Position Summary: To be a technology expert architecting solutions and mentoring people in BI/reporting processes, with prior expertise in the Pharma domain.

Job Responsibilities:
- Technology Leadership: Lead and guide the team, independently or with little support, to design, implement and deliver complex reporting and BI project assignments.
- Technical Portfolio: Expertise in a range of BI and hosting technologies such as the AWS stack (Redshift, EC2), QlikView, QlikSense, Tableau, MicroStrategy, and Spotfire.
- Project Management: Get accurate briefs from the client and translate them into tasks for team members with priorities and timeline plans. Must maintain high standards of quality and thoroughness. Should be able to monitor the accuracy and quality of others' work. Ability to think in advance about potential risks and mitigation plans.
- Logical Thinking: Able to think analytically and use a systematic and logical approach to analyze data, problems, and situations. Must be able to guide team members in analysis.
- Client Relationship: Manage the client relationship and client expectations independently. Should be able to deliver results back to the client independently. Should have excellent communication skills.

Education: BE/B.Tech, Master of Computer Application

Work Experience:
- Minimum of 5 years of relevant experience in the Pharma domain.
- Technical: Should have 10+ years of hands-on experience in at least 2 of the following: QlikView, QlikSense, Tableau, MicroStrategy, Spotfire / (Informatica, SSIS, Talend, Matillion) / Big Data technologies (Hadoop ecosystem). Aware of techniques such as UI design, report modeling, performance tuning, and regression testing. Basic expertise with MS Excel; advanced expertise with SQL.
- Functional: Should have experience with the following concepts and technologies: Pharma data sources like IMS, Veeva, Symphony, Cegedim, etc.; business processes like alignment, market definition, segmentation, sales crediting, and activity metrics calculation; calculation of all sales, activity, and managed care KPIs.

Behavioural Competencies: Teamwork & Leadership, Motivation to Learn and Grow, Ownership, Cultural Fit, Talent Management

Technical Competencies: Problem Solving, Life Science Knowledge, Communication, Project Management, Attention to P&L Impact, Capability Building / Thought Leadership, Scale of revenues managed/delivered

Posted 5 days ago

Apply

2.0 - 7.0 years

13 - 18 Lacs

Bengaluru

Work from Office


Job Area: Engineering Group, Engineering Group > Software Engineering

General Summary: As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Software Engineer, you will design, develop, create, modify, and validate embedded and cloud edge software, applications, and/or specialized utility programs that launch cutting-edge, world-class products that meet and exceed customer needs. Qualcomm Software Engineers collaborate with systems, hardware, architecture, test engineers, and other teams to design system-level software solutions and obtain information on performance requirements and interfaces.

Minimum Qualifications:
- Bachelor's degree in Engineering, Information Systems, Computer Science, or a related field and 2+ years of Software Engineering or related work experience; OR Master's degree in Engineering, Information Systems, Computer Science, or a related field and 1+ year of Software Engineering or related work experience; OR PhD in Engineering, Information Systems, Computer Science, or a related field.
- 2+ years of academic or work experience with a programming language such as C, C++, Java, Python, etc.

Preferred Qualifications: The display software team is looking for talented software engineers interested in developing software for mobile and embedded devices. The display software team is responsible for delivering device drivers and tools for Snapdragon chipsets, providing best-in-class performance, power, and features. This role will involve working on firmware development for Display. Responsibilities will include the design and development of new features, support for new hardware pre/post-silicon development, debugging of issues within software, optimizing software for performance and power, development of unit tests, and working with our partners and OEMs. In addition, they will be working with other technologies including video encoders, video decoders, DSPs, and GPU for QC multimedia cores towards meeting project milestones.

Principal Duties and Responsibilities:
- Detail-oriented with strong analytical and debugging skills.
- Strong working knowledge of C/C++ programming.
- Knowledge of one or more operating systems or RTOS (Embedded Linux, Windows).
- Strong working knowledge of the Linux kernel. Experienced in Linux kernel architecture and driver development, including signals, priorities, deadlocks, stacks, interrupts, memory management, the scheduler, synchronization methods, etc.
- Understanding of low-level software/hardware interface design and debugging.
- Knowledge in one or more of the following disciplines is preferred: Display (pixel processing/composition, MIPI DSI, HDMI, DisplayPort, etc.).
- Experience with the following display/graphics frameworks and platforms: Android, Weston/Wayland. DRM/KMS driver experience is an added advantage.

Level of Responsibility: Works under supervision. Decision-making may affect work beyond the immediate work group. Requires verbal and written communication skills to convey information. May require basic negotiation, influence, tact, etc. Tasks do not have defined steps; planning, problem-solving, and prioritization must occur to complete the tasks effectively.

Posted 5 days ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Nagercoil

Work from Office


Managing sales of Loan Against Property & Business Loans for the Ameerpet region. Lead a team of Relationship Managers to generate business through direct sourcing. Build the sales and distribution network in the assigned territory. Recruit, train, and monitor team members, ensuring quality service delivery. Manage the loan process from lead generation till disbursement of the loan. Ensure synergy between sales, credit, and operations to ensure the efficiency of business processes.

Posted 6 days ago

Apply

2.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Profile: Sr. DW BI Developer
Location: Sector 64, Noida (Work from Office)

Position Overview: Working with the Finance Systems Manager, the role will ensure that the ERP system is available and fit for purpose. The ERP Systems Developer will be developing the ERP system, providing comprehensive day-to-day support and training, and developing the current ERP system for the future.

Key Responsibilities:
- As a Sr. DW BI Developer, participate in the design, development, customization, and maintenance of software applications.
- Analyse the different applications/products, and design and implement the DW using best practices.
- Apply rich data governance experience: data security, data quality, provenance/lineage.
- Maintain a close working relationship with the other application stakeholders.
- Develop secure and high-performance web applications.
- Apply knowledge of software development life-cycle methodologies, e.g. Iterative, Waterfall, Agile, etc.
- Design and architect future releases of the platform.
- Participate in troubleshooting application issues.
- Jointly work with other teams and partners handling different aspects of the platform creation.
- Track advancements in software development technologies and apply them judiciously in the solution roadmap.
- Ensure all quality controls and processes are adhered to.
- Plan the major and minor releases of the solution.
- Ensure robust configuration management.
- Work closely with the Engineering Manager on different aspects of product lifecycle management.
- Independently work in a fast-paced environment requiring multitasking and efficient time management.

Required Skills and Qualifications:
- End-to-end lifecycle of data warehousing, data lakes, and reporting.
- Experience maintaining/managing data warehouses.
- Responsible for the design and development of large, scaled-out, real-time, high-performing Data Lake / Data Warehouse systems (including big data and cloud).
- Strong SQL and analytical skills.
- Experience in Power BI, Tableau, QlikView, QlikSense, etc.
- Experience in Microsoft Azure services, including developing and supporting ADF pipelines; Azure SQL Server / Databricks / Azure Analysis Services; and developing tabular models.
- Experience working with APIs.
- Minimum 2 years of experience in a similar role.
- Experience with data warehousing and data modelling; 2-6 years of total experience in building DW/BI systems.
- Experience with ETL and working with large-scale datasets; proficiency in writing and debugging complex SQLs.
- Prior experience working with global clients.
- Hands-on experience with Kafka, Flink, Spark, Snowflake, Airflow, NiFi, Oozie, Pig, Hive, Impala, and Sqoop.
- Storage such as HDFS, object storage (S3, etc.), RDBMS, MPP, and NoSQL DBs.
- Experience with distributed data management and data failover, including databases (relational, NoSQL, big data), data analysis, data processing, data transformation, high availability, and scalability.
- Experience in end-to-end project implementation in the cloud (Azure / AWS / GCP) as a DW BI Developer.
- Understanding of industry trends and products in DataOps, continuous intelligence, augmented analytics, and AI/ML.

To know our Privacy Policy, please click on the link below or copy-paste the URL into your browser: https://gedu.global/wp-content/uploads/2023/09/GEDU-Privacy-Policy-22092023-V2.0-1.pdf

Posted 6 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description

Key Responsibilities:
- Design and develop interactive dashboards, reports, and visualizations using Power BI to drive critical business insights.
- Write complex SQL queries, stored procedures, and functions to effectively extract, transform, and load (ETL) data from various sources.
- Optimize and maintain SQL databases, ensuring data integrity, performance, and reliability.
- Develop robust data models and implement sophisticated DAX calculations in Power BI for advanced analytics.
- Integrate Power BI with diverse data sources, including various databases, cloud storage solutions, and APIs.
- Work closely with business stakeholders to meticulously gather requirements and translate them into actionable Business Intelligence solutions.
- Troubleshoot performance issues related to Power BI dashboards and SQL queries, ensuring optimal system performance.
- Stay updated with the latest trends and advancements in Power BI, SQL, and the broader field of data analytics.

All About You:
- Hands-on experience managing technology projects, with a demonstrated ability to understand complex data and technology initiatives.
- Ability to lead and influence others to advance deliverables.
- Understanding of emerging technologies, including but not limited to cloud architecture, machine learning/AI, and big data infrastructure.
- Data architecture experience and experience in building data models.
- Experience deploying and working with big data technologies like Hadoop, Spark, and Sqoop.
- Experience with streaming frameworks like Kafka and Axon, and pipelines like NiFi.
- Proficient in OO programming (Python, Java/Spring Boot/J2EE, and Scala).
- Experience with the Hadoop ecosystem (HDFS, YARN, MapReduce, Spark, Hive, Impala).
- Experience with Linux, the Unix command line, Unix shell scripting, SQL, and any scripting language.
- Experience with data visualization tools such as Tableau, Domo, and/or Power BI is a plus.
- Experience presenting data findings in a readable, insight-driven format, including building support decks.

Posted 6 days ago

Apply

7.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Summary

Strategy: As part of FM – Funds and Securities Services Technology, the developer is recruited to develop and deliver solutions supporting various initiatives that enable Operations to fulfil client requirements.

Business / Technical Requirements:
- Experience in designing and implementing enterprise applications in Java/.Net.
- Experience with Oracle/SQL Server; good knowledge of developing stored procedures using Oracle PL/SQL.
- Knowledge of big data concepts and the related tech stack, such as Hadoop and Hive/Spark/Sqoop; able to work on data extraction and data lake initiatives.
- DevOps (ADO, JIRA, Jenkins, Ansible, GitHub) exposure/experience.
- Knowledge of AWS/Azure cloud-native and VM concepts; proficient with containers.
- Proficiency in Oracle SQL, SQL Server, and DBA tasks.
- Working experience in solution design, capacity planning, and sizing.

Functional Requirements:
- Experience in Fund Accounting, Transfer Agency and/or Hedge Fund Administration.
- Knowledge of market instruments and conventions.
- Specialism within fund accounting, client reporting, or investment operations.
- Hands-on experience with MultiFonds Fund Administration and Global Investor products, or equivalent fund accounting products.

Key Responsibilities

Risk Management: Proactively identify and track obsolescence of hardware and software components, including OS or CVE patches, for Funds apps and interfaces.

Governance: Develop and deliver as per the SDF-mandated process. Follow release management standards and tools to deploy deliverables to production. Hand over to Production Support as per process.

Regulatory & Business Conduct: Display exemplary conduct and live by the Group's Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct. Lead the Agile Squad to achieve the outcomes set out in the Bank's Conduct Principles: [Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.] Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters.

Key Stakeholders: FM – Funds and Securities Services Operations, Technology and Production Support.

Other Responsibilities:
- Coordinate between the product vendor and business stakeholders for requirement finalization.
- Understand functional requirements and provide solutions by developing the required components or reports.
- Unit test and support SIT and UAT.
- Follow the SCB change management process to promote developed components to production.
- Proactively input to solution design, including an architectural view of our technology landscape and experience with data optimisation solutions (DB table mapping, data logic, development code design, etc.).
- Participate in the identification of non-functional requirements such as security requirements and performance objectives.
- Coordinate between various internal support teams.
- Pick up new technologies with ease, solve complex technical problems, and multitask between different projects.

Skills and Experience:
- Windows/Linux
- WebLogic, Citrix, MQ, Solace
- AWS/Azure cloud-native and VM concepts
- PL/SQL
- Domain experience in Fund Accounting, Transfer Agency and/or Hedge Fund Administration
- DevOps

Qualifications: Degree in Computer Science, MCA, or equivalent. 7 to 10 years of prior work experience in the stated technology.

About Standard Chartered

We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us.

Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together we:
- Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
- Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well
- Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term

What We Offer

In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
- Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, combined to a minimum of 30 days.
- Flexible working options based around home and office locations, with flexible working patterns.
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits.
- A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
- Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies, where everyone feels respected and can realise their full potential.

Posted 6 days ago

Apply

4.0 - 8.0 years

15 - 30 Lacs

Noida, Hyderabad, India

Hybrid


Required skills and responsibilities: Spark architecture, Spark tuning, Delta tables, Medallion architecture, Databricks, Azure cloud services, Python OOP concepts, complex PySpark transformations, reading data from different file formats and sources and writing to Delta tables, data warehousing concepts, and how to process large files and handle pipeline failures in current projects.
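As an illustration of the read-from-any-source, write-to-Delta pattern this role centres on, here is a minimal PySpark sketch, assuming a Databricks or other Delta-enabled Spark runtime; the paths and table names are hypothetical:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bronze-ingest").getOrCreate()

# Read raw CSV from a landing zone; the same reader pattern applies to JSON or Parquet sources
raw = (spark.read
       .option("header", "true")
       .csv("/mnt/landing/orders/*.csv"))

# Append into a bronze Delta table; mergeSchema tolerates drifting source columns
(raw.write
 .format("delta")
 .mode("append")
 .option("mergeSchema", "true")
 .saveAsTable("bronze.orders"))

In a Medallion layout, this bronze table would then feed cleansed silver tables and aggregated gold tables downstream.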

Posted 6 days ago

Apply

8.0 - 13.0 years

35 - 50 Lacs

Mumbai

Work from Office


Hiring a Big Data Lead with 8+ years of experience for the US shift.
Must have:
- Big Data: Spark, Hadoop, Kafka, Hive, Flink
- Backend: Python, Scala
- NoSQL: MongoDB, Cassandra
- Cloud: AWS/Azure/GCP, Snowflake, Databricks
- Docker, Kubernetes, CI/CD
Required candidate profile:
- Excellent at mentoring/training in Big Data: HDFS, YARN, Airflow, Hive, MapReduce, HBase, Kafka, ETL/ELT, real-time streaming and data modeling
- Immediate joiner is a plus
- Excellent communication
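To ground the real-time streaming requirement, a minimal Structured Streaming sketch in PySpark, assuming the spark-sql-kafka package is on the classpath; the broker, topic and path names are placeholders:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Subscribe to a Kafka topic (broker and topic names are placeholders)
events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "events")
          .load())

# Kafka delivers key/value as binary, so cast before any downstream parsing
parsed = events.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

# Write the stream to storage with checkpointing for fault tolerance
query = (parsed.writeStream
         .format("parquet")
         .option("path", "/data/events")
         .option("checkpointLocation", "/chk/events")
         .start())

query.awaitTermination()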

Posted 6 days ago

Apply

4.0 - 8.0 years

10 - 20 Lacs

Pune, Bengaluru

Work from Office


We are looking for skilled Hadoop and Google Cloud Platform (GCP) Engineers to join our dynamic team. If you have hands-on experience with Big Data technologies and cloud ecosystems, we want to hear from you!
Key Skills:
- Hadoop ecosystem (HDFS, MapReduce, YARN, Hive, Spark)
- Google Cloud Platform (BigQuery, Dataproc, Cloud Composer)
- Data ingestion & ETL pipelines
- Strong programming skills (Java, Python, Scala)
- Experience with real-time data processing (Kafka, Spark Streaming)
Why Join Us?
- Work on cutting-edge Big Data projects
- Collaborate with a passionate and innovative team
- Opportunities for growth and learning
Interested candidates, please share your updated resume or connect with us directly!
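For flavour, a small PySpark job of the kind that runs on Dataproc, assuming the spark-bigquery connector is available on the cluster; the project, dataset, table and bucket names are hypothetical:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("gcp-etl").getOrCreate()

# Read a BigQuery table through the spark-bigquery connector
trips = (spark.read.format("bigquery")
         .option("table", "my-project.analytics.trips")
         .load())

# A simple aggregation as the "transform" step of the pipeline
daily = trips.groupBy("trip_date").count()

# Write the result back to BigQuery; the connector stages data via a GCS bucket
(daily.write.format("bigquery")
 .option("table", "my-project.analytics.daily_trip_counts")
 .option("temporaryGcsBucket", "my-temp-bucket")
 .save())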

Posted 6 days ago

Apply

2.0 - 4.0 years

8 - 12 Lacs

Mumbai

Work from Office


The SAS to Databricks Migration Developer will be responsible for migrating existing SAS code, data processes, and workflows to the Databricks platform. This role requires expertise in both SAS and Databricks, with a focus on converting SAS logic into scalable PySpark and Python code. The developer will design, implement, and optimize data pipelines, ensuring seamless integration and functionality within the Databricks environment. Collaboration with various teams is essential to understand data requirements and deliver solutions that meet business needs.
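To illustrate the kind of conversion involved, a hedged sketch of a simple SAS DATA step and its PySpark equivalent; the SAS step, table names and columns below are invented for the example, not taken from this posting:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sas-migration").getOrCreate()

# SAS original (hypothetical):
#   data work.high_value;
#     set src.claims;
#     where status = 'OPEN';
#     net_amount = amount * (1 - discount);
#   run;

claims = spark.read.table("src.claims")  # assumes the source is registered as a table

high_value = (claims
              .filter(F.col("status") == "OPEN")
              .withColumn("net_amount", F.col("amount") * (1 - F.col("discount"))))

# Persist as a Delta table, mirroring the SAS output dataset
high_value.write.format("delta").mode("overwrite").saveAsTable("work.high_value")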

Posted 1 week ago

Apply

0.0 - 4.0 years

3 - 7 Lacs

Pune

Work from Office


Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About The Role

Role Purpose
The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers’ and clients’ business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, Customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed, Technical Test performance

Mandatory Skills: Scala programming.
Experience: 5-8 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 week ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About The Role

Role Purpose
The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers’ and clients’ business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, Customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed, Technical Test performance

Mandatory Skills: Hadoop.
Experience: 5-8 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 week ago

Apply

8.0 - 13.0 years

30 - 35 Lacs

Chennai

Work from Office


KC International School is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey:
- Liaising with coworkers and clients to elucidate the requirements for each task
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
- Reformulating existing frameworks to optimize their functioning
- Testing such structures to ensure that they are fit for use
- Preparing raw data for manipulation by data scientists
- Detecting and correcting errors in your work
- Ensuring that your work remains backed up and readily accessible to relevant coworkers
- Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs
The Data Engineer at KC will design, develop and maintain all school data infrastructure, ensuring accurate and efficient data management.

Posted 1 week ago

Apply