Home
Jobs

2384 Hive Jobs - Page 40

Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
Filter
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

9.0 - 12.0 years

16 - 21 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Source: Naukri

Company: KPI Partners
Location: Bangalore, Karnataka, India; Hyderabad, Telangana, India; Pune, Maharashtra, India
Experience: 9 to 16 years

Job Description:
KPI Partners, a leader in providing analytics and data management solutions, is seeking a highly skilled **Senior Data Architect** to join our dynamic team. This position offers an exciting opportunity to work on innovative data solutions that drive business value for our clients. You will be responsible for designing, developing, and implementing data architectures that align with our organizational goals and client requirements.

Key Responsibilities:
- Lead the design and implementation of data architecture solutions, ensuring alignment with best practices and compliance standards.
- Develop comprehensive data models to support different business applications and analytical needs.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Oversee the integration of SAP and other data sources into the Datasphere platform.
- Create strategies for data governance, quality assurance, and data lifecycle management.
- Ensure scalability and performance of data systems through efficient architecture practices.
- Mentor and guide junior data professionals in data architecture and modeling best practices.
- Stay updated with industry trends and emerging technologies in data architecture and analytics.

Required Skills:
- 9 to 12 years of experience in data architecture, data modeling, and data management.
- Strong expertise in SAP systems and data integration processes.
- Proficiency in Datasphere or similar data management platforms.
- Solid understanding of data governance, data warehousing, and big data technologies.
- Excellent analytical and problem-solving abilities.
- Strong communication skills to engage with technical and non-technical stakeholders.
- Ability to work independently and as part of a team in a fast-paced environment.

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Relevant certifications in data architecture or SAP technologies.
- Experience in cloud data solutions and platform migrations is a plus.

What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative work environment fostering innovation and creativity.
- The chance to work with cutting-edge technology and notable clients.

Posted 1 week ago

Apply

3.0 - 5.0 years

12 - 16 Lacs

Chennai

Work from Office

Source: Naukri

G6 Cloud Data Architect (Snowflake + dbt); this is proactive hiring.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Hyderabad

Work from Office

Source: Naukri

- At least 8+ years of experience in any of the ETL tools (Prophecy, DataStage 11.5/11.7, Pentaho, etc.).
- At least 3 years of experience in PySpark with GCP (Airflow, Dataproc, BigQuery), capable of configuring data pipelines.
- Strong experience in writing complex SQL queries to perform data analysis on databases (SQL Server, Oracle, Hive, etc.).
- Technical skills: SQL, Python, PySpark, Hive, ETL, Unix, Control-M (or similar scheduling tools).
- Ability to work independently on specialized assignments within the context of project deliverables.
- Take ownership of providing solutions and tools that iteratively increase engineering efficiencies.
- Designs should help embed standard processes, systems, and operational models into the BAU approach for end-to-end execution of data pipelines.
- Proven problem-solving and analytical abilities, including the ability to critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details, and apply sound business and technical domain knowledge.
- Communicate openly and honestly. Advanced oral, written, and visual communication and presentation skills; the ability to communicate efficiently at a global level is paramount.
- Ability to deliver materials of the highest quality to management against tight deadlines.
- Ability to work effectively under pressure with competing and rapidly changing priorities.
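Several requirements above center on writing complex analytical SQL. As a neutral, self-contained illustration (not tied to this employer's actual databases; the table and data are invented), a window-function query of the kind such analysis involves can be sketched with Python's built-in sqlite3:

```python
import sqlite3

# Illustrative only: a small in-memory table standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("South", "2024-01", 100.0), ("South", "2024-02", 150.0),
     ("North", "2024-01", 200.0), ("North", "2024-02", 180.0)],
)

# A typical analytical query: month-over-month change per region,
# using a LAG window function (supported in SQLite 3.25+).
rows = conn.execute("""
    SELECT region, month, amount,
           amount - LAG(amount) OVER (
               PARTITION BY region ORDER BY month
           ) AS mom_change
    FROM sales
    ORDER BY region, month
""").fetchall()

for r in rows:
    print(r)
```

The same PARTITION BY / ORDER BY idiom carries over directly to Hive, Oracle, and SQL Server.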

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Pune

Work from Office

Source: Naukri

- Capable of developing/configuring data pipelines in a variety of platforms and technologies.
- Technical skills: SQL, Python, PySpark, Hive, ETL, Unix, Control-M (or similar scheduling tools).
- Strong experience in writing complex SQL queries to perform data analysis on databases (SQL Server, Oracle, Hive, etc.).
- Experience with GCP (particularly Airflow, Dataproc, BigQuery) is an advantage.
- Experience creating solutions that power AI/ML models and generative AI.
- Ability to work independently on specialized assignments within the context of project deliverables.
- Take ownership of providing solutions and tools that iteratively increase engineering efficiencies.
- Capable of creating designs that help embed standard processes, systems, and operational models into the BAU approach for end-to-end execution of data pipelines.
- Demonstrated problem-solving and analytical abilities, including the ability to critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details, and apply sound business and technical domain knowledge.
- Communicate openly and honestly, using sophisticated oral, written, and visual communication and presentation skills; the ability to communicate efficiently at a global level is paramount.
- Ability to deliver materials of the highest quality to management against tight deadlines.
- Ability to work effectively under pressure with competing and rapidly changing priorities.

Posted 1 week ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Source: Naukri

We are looking for a Data Engineer with experience in data warehouse projects, strong expertise in Snowflake, and hands-on knowledge of Azure Data Factory (ADF) and dbt (Data Build Tool). Proficiency in Python scripting is an added advantage.

Key Responsibilities:
- Design, develop, and optimize data pipelines and ETL processes for data warehousing projects.
- Work extensively with Snowflake, ensuring efficient data modeling and query optimization.
- Develop and manage data workflows using Azure Data Factory (ADF) for seamless data integration.
- Implement data transformations, testing, and documentation using dbt.
- Collaborate with cross-functional teams to ensure data accuracy, consistency, and security.
- Troubleshoot data-related issues.
- (Optional) Utilize Python for scripting, automation, and data processing tasks.

Required Skills & Qualifications:
- Experience in data warehousing with a strong understanding of best practices.
- Hands-on experience with Snowflake (data modeling, query optimization).
- Proficiency in Azure Data Factory (ADF) for data pipeline development.
- Strong working knowledge of dbt (Data Build Tool) for data transformations.
- (Optional) Experience in Python scripting for automation and data manipulation.
- Good understanding of SQL and query optimization techniques.
- Experience in cloud-based data solutions (Azure).
- Strong problem-solving skills and ability to work in a fast-paced environment.
- Experience with CI/CD pipelines for data engineering.

Why Join Us:
- Opportunity to work on cutting-edge data engineering projects.
- Work with a highly skilled and collaborative team.
- Exposure to modern cloud-based data solutions.
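A core pattern in the pipeline work described above is the idempotent, re-runnable load (what Snowflake expresses with MERGE and ADF with upsert sinks). A minimal sketch of that pattern, using SQLite's ON CONFLICT clause so the example is self-contained (the table and data are invented):

```python
import sqlite3

# Hypothetical sketch: an idempotent "upsert" load step of the kind a
# Snowflake MERGE or an ADF copy-with-upsert performs, shown here with
# SQLite (3.24+) so the example runs anywhere.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")

def load_batch(conn, batch):
    """Insert new rows and update existing ones, keyed on id."""
    conn.executemany(
        """INSERT INTO dim_customer (id, name, city) VALUES (?, ?, ?)
           ON CONFLICT(id) DO UPDATE SET name = excluded.name, city = excluded.city""",
        batch,
    )

load_batch(conn, [(1, "Asha", "Pune"), (2, "Ravi", "Chennai")])
load_batch(conn, [(2, "Ravi", "Hyderabad"), (3, "Meera", "Kochi")])  # safe to re-run

rows = conn.execute("SELECT id, name, city FROM dim_customer ORDER BY id").fetchall()
print(rows)  # three rows; id 2 reflects the latest city
```

Because re-running a batch updates rather than duplicates, the same step can be retried after a pipeline failure without corrupting the target.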

Posted 1 week ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Source: LinkedIn

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary
Managing Consultant – Performance Analytics, Advisors & Consulting Services

Services within Mastercard is responsible for acquiring, engaging, and retaining customers by managing fraud and risk, enhancing cybersecurity, and improving the digital payments experience. We provide value-added services and leverage expertise, data-driven insights, and execution. Our Advisors & Consulting Services team combines traditional management consulting with Mastercard’s rich data assets, proprietary platforms, and technologies to provide clients with powerful strategic insights and recommendations. Our teams work with a diverse global customer base across industries, from banking and payments to retail and restaurants.

The Advisors & Consulting Services group has five specializations: Strategy & Transformation, Performance Analytics, Business Experimentation, Marketing, and Program Management. Our Performance Analytics consultants translate data into insights by leveraging Mastercard and customer data to design, implement, and scale analytical solutions for customers. They use qualitative and quantitative analytical techniques and enterprise applications to synthesize analyses into clear recommendations and impactful narratives.

Positions for different specializations and levels are available in separate job postings. Please review our consulting specializations to learn more about all opportunities and apply for the position that is best suited to your background and experience: https://careers.mastercard.com/us/en/consulting-specializations-at-mastercard

Roles and Responsibilities

Client Impact
- Lead client engagements across a range of industries and problem statements
- Develop analytics strategies and programs for large, regional, and global clients by leveraging data and technology solutions to unlock client value
- Own key relationships with mid-level to senior client stakeholders and independently assess client agenda, internal culture, and change readiness

Team Collaboration & Culture
- Lead the team to creative insights and sound business recommendations, and deliver impactful client presentations while growing team members’ roles and skills
- Provide analytical and day-to-day project delivery team leadership, and create a collaborative and inclusive environment for all levels
- Collaborate with internal Mastercard stakeholders, including Product and Business Development, to scope projects, create relevant solutions for clients, and build the firm's intellectual capital
- Provide on-the-job training, coaching, and mentorship to junior consultants

Qualifications

Basic qualifications:
- Undergraduate degree with data and analytics experience in business intelligence and/or descriptive, predictive, or prescriptive analytics
- Experience coaching and managing teams across multiple projects
- Experience managing key client relationships
- Knowledge of business KPIs, financials, and organizational leadership
- Ability to identify new business development opportunities, and experience drafting proposals and scoping new opportunities
- Analytical, interpretive, and problem-solving skills, including the proven ability to analyze large amounts of data and synthesize key findings and recommendations
- Data and analytics experience, such as working with data analytics software (e.g., Python, R, SQL, SAS); building, managing, and maintaining database structures; and working with data visualization tools (e.g., Tableau, Power BI)
- Advanced Word, Excel, and PowerPoint skills
- Ability to manage multiple tasks and clients in a fast-paced, deadline-driven environment
- Ability to communicate effectively in English and the local office language (if applicable)
- Eligibility to work in the country where you are applying, as well as to apply for travel visas as required by travel needs

Preferred qualifications:
- Additional data and analytics experience in the Hadoop framework and coding using Impala, Hive, or PySpark
- Experience generating new knowledge or creating innovative solutions for a firm
- Relevant industry expertise
- Master’s degree with a relevant specialization such as advanced analytics, big data, or a mathematical discipline (not required)
- For this role, a specific focus on experience with risk, including credit risk, fraud in payments, and authorisation of card payments

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard’s security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

R-236035

Posted 1 week ago

Apply

1.0 - 4.0 years

3 - 6 Lacs

Pune

Hybrid

Source: Naukri

Must-have skills: Python, R, SQL, Power BI, Spotfire, Hadoop, Hive
Good-to-have skills: Spark, Statistics, Big Data

Job Description:
As part of a digital delivery data group supporting bp Solutions, you will apply your domain knowledge and familiarity with domain data processes to support the organisation. Part of bp's Production & Operations business, bp Solutions has hubs in London, Pune, and Houston. The data team provides daily operational data management, data engineering, and analytics support to this organisation across a broad range of activity, from facilities and subsea engineering to logistics.

About the role:
A data analyst collects, processes, and performs analyses on a variety of datasets. Key responsibilities include interpreting complex data sets to identify trends and patterns, using analytical tools and methods to generate actionable insights, and creating visualizations and reports to communicate those insights and recommendations to support decision-making. Data analysts collaborate closely with business domain stakeholders to understand their data analysis needs, ensure data accuracy, recommend data-driven solutions, and solve value-impacting business problems.

You might be a good fit for this role if you:
- Have strong domain knowledge in at least one of: facilities or subsea engineering, maintenance and reliability, operations, or logistics.
- Have strong analytical skills and demonstrable capability in applying analytical techniques and Python scripting to solve practical problems.
- Are curious, and keen to apply new technologies, trends, and methods to improve existing standards and the capabilities of the subsurface community.
- Are well organized and self-motivated; you balance proactive and reactive approaches across multiple priorities to complete tasks on time.
- Apply judgment and common sense; you use insight and good judgment to inform actions and respond to situations as they arise.

What you will deliver:
- Be a bridge between asset teams and Technology, combining an in-depth understanding of one or more relevant domains with data & analytics skills.
- Provide actionable, data-driven insights by combining deep statistical skills, data manipulation capabilities, and business insight.
- Proactively identify impactful opportunities and autonomously complete data analysis, applying existing data & analytics strategies relevant to your immediate scope.
- Clean, pre-process, and analyse both structured and unstructured data.
- Develop data visualisations to analyse and interrogate broad datasets (e.g. with tools such as Microsoft Power BI, Spotfire, or similar).
- Present results to peers and senior management, influencing decision-making.

What you will need to be successful (experience and qualifications):

Essential:
- MSc or equivalent experience in a quantitative field, preferably statistics.
- Strong domain knowledge in at least one of: facilities or subsea engineering, maintenance and reliability, operations, or logistics.
- Hands-on experience carrying out data analytics, data mining, and product analytics in complex, fast-paced environments.
- Applied knowledge of data analytics and data pipelining tools and approaches across all data lifecycle stages.
- Deep understanding of a few, and a high-level understanding of several, commonly available statistical approaches.
- Advanced SQL knowledge.
- Advanced scripting experience in R or Python.
- Ability to write and maintain moderately complex data pipelines.
- Customer-centric and pragmatic mindset; focus on value delivery and swift execution, while maintaining attention to detail.
- Excellent communication and interpersonal skills, with the ability to effectively communicate ideas, expectations, and feedback to team members, stakeholders, and customers; foster collaboration and teamwork.

Desired:
- Advanced analytics degree.
- Experience applying analytics to support engineering turnarounds.
- Experience with big data technologies (e.g. Hadoop, Hive, and Spark) is a plus.

Posted 1 week ago

Apply

6.0 - 14.0 years

0 Lacs

Kochi, Kerala, India

On-site

Source: LinkedIn

Skill: PySpark
Experience: 6 to 14 years
Location: Kochi (walk-in on 14th Jun)

- Design, develop, and maintain efficient and scalable solutions using PySpark.
- Ensure data quality and integrity by implementing robust testing, validation, and cleansing processes.
- Integrate data from various sources, including databases, APIs, external datasets, etc.
- Optimize and tune PySpark jobs for performance and reliability.
- Document data engineering processes, workflows, and best practices.
- Strong understanding of databases, data modelling, and ETL tools and processes.
- Strong programming skills in Python and proficiency with PySpark and SQL.
- Experience with relational databases, Hadoop, Spark, Hive, and Impala.
- Excellent communication and collaboration skills.
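The validation and cleansing work such a role describes boils down to rule-based filtering and de-duplication. A minimal plain-Python sketch (a real implementation would express the same rules as PySpark DataFrame filters; the records and rules here are invented):

```python
# Hypothetical cleansing step: drop rows failing a null check, then
# de-duplicate on a key column, keeping the first occurrence.
records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},             # fails the null check
    {"id": 1, "email": "a@example.com"},  # duplicate id
    {"id": 3, "email": "c@example.com"},
]

def clean(rows):
    """Drop rows with missing emails, then de-duplicate on id (first wins)."""
    seen, out = set(), []
    for row in rows:
        if row["email"] is None:
            continue
        if row["id"] in seen:
            continue
        seen.add(row["id"])
        out.append(row)
    return out

cleaned = clean(records)
print([r["id"] for r in cleaned])  # expected: [1, 3]
```

In PySpark the equivalent would be a `filter` on the null condition followed by `dropDuplicates(["id"])`, run distributed across the cluster.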

Posted 1 week ago

Apply

170.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Summary
The role is accountable for tactical and operational support for production services across one or more areas of a specific platform/domain: to ensure maximum service quality and stability through fast and effective response to technical incidents, and to be a catalyst for change via analysis and identification of continual service improvement opportunities. Depending on the area of technical specialisation, in addition to incident resolution and prevention, the role may also be involved in a control capacity to ensure that new changes to the technology estate do not introduce instability.

- Manage technical resumption of high-priority, S@R, and medium/high-severity incidents; provide end-to-end support and implement resolutions to resolve incidents within SLA.
- Provide root cause analysis for S@R and medium/high-severity issues; ensure all follow-up action points are carried out.
- Responsible for the stability of the production system; direct second and third levels of support for problem diagnosis and resolution as per the agreed SLAs.
- Responsible for managing production-related changes, releases, and rollouts with zero or minimal impact to the stability of the application; review dependent changes of the surrounding systems, infrastructure, networking, etc.
- Responsible for ensuring proper technical plans are in place for all production changes (e.g. fallback plan, implementation plan, data conversion, etc.).
- Create and update Production Support documentation, and contingency (DR/BCP) documentation and processes.
- Provide inputs to the PSS manager for a monthly dashboard covering incident and problem trends, along with SIP and RCA action items.
- Participate in and support cross-training and knowledge transfer activities within support teams.

Key Responsibilities
Strategy: Be accountable for executing the strategy devised for the business unit.
Business: Fully accountable for incident, problem, change, and risk management relating to the production application/system.
Processes: Create, review, and update Production Support documentation; update contingency (DR/BCP) documentation and processes.
People & Talent: Participate in cross-training and knowledge transfer activities within support teams.
Risk Management: Proactively identify risks in the application and manage the mitigation actions; manage, track, and ensure timely closure of risks and other compliance-related issues in Riskwise (information security risks) and M7 (operational risks).
Governance: Provide inputs to management for a monthly dashboard covering incident and problem trends, along with SIP and RCA action items.

Regulatory & Business Conduct
- Display exemplary conduct and live by the Group’s Values and Code of Conduct.
- Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines, and the Group Code of Conduct.
- Lead to achieve the outcomes set out in the Bank’s Conduct Principles: Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.
- Effectively and collaboratively identify, escalate, mitigate, and resolve risk, conduct, and compliance matters.

Key Stakeholders: SRE PO, Hive PO, SRE Lead, Business Stakeholders, CIO counterparts

Other Responsibilities: Embed "Here for good" and the Group's brand and values in the Support team; perform other responsibilities assigned under Group, Country, Business, or Functional policies and procedures; multiple functions (double hats).

Skills and Experience: AWS, Oracle, Linux, Kubernetes, API

Qualifications: AWS Certification, SRE Certification, ITIL Certification (good to have)

Competencies: Action Oriented; Collaborates; Customer Focus; Gives Clarity & Guidance; Manages Ambiguity; Develops Talent; Drives Vision & Purpose; Nimble Learning; Decision Quality; Courage; Instills Trust; Strategic Mindset. Technical competencies: a generic competency to evaluate the candidate on role-specific technical skills and requirements.

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge, and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents, and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.

Together:
- We do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do.
- We never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well.
- We are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term.

What We Offer:
- In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial, and social wellbeing.
- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
- Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum), and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to 30 days minimum.
- Flexible working options based around home and office locations, with flexible working patterns.
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform; development courses for resilience and other human skills; a global Employee Assistance Programme; sick leave; mental health first-aiders; and a range of self-help toolkits.
- A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual, and digital learning.
- Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions, and geographies, where everyone feels respected and can realise their full potential.

Recruitment Assessments
Some of our roles use assessments to help us understand how suitable you are for the role you've applied to. If you are invited to take an assessment, this is great news: it means your application has progressed to an important stage of our recruitment process. Visit our careers website: www.sc.com/careers

Posted 1 week ago

Apply

0 years

0 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary:
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC:
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities:
- Design, develop, and optimize data pipelines and ETL processes using PySpark or Scala to extract, transform, and load large volumes of structured and unstructured data from diverse sources.
- Implement data ingestion, processing, and storage solutions on the GCP cloud platform, leveraging its services.
- Develop and maintain data models, schemas, and metadata to support efficient data access, query performance, and analytics requirements.
- Monitor pipeline performance, troubleshoot issues, and optimize data processing workflows for scalability, reliability, and cost-effectiveness.
- Implement data security and compliance measures to protect sensitive information and ensure regulatory compliance.

Requirements:
- Proven experience as a Data Engineer, with expertise in building and optimizing data pipelines using PySpark, Scala, and Apache Spark.
- Hands-on experience with cloud platforms, particularly GCP, and proficiency in GCP services.
- Strong programming skills in Python and Scala, with experience in software development, version control, and CI/CD practices.
- Familiarity with data warehousing concepts, dimensional modeling, and relational databases (e.g., SQL Server, PostgreSQL, MySQL).
- Experience with big data technologies and frameworks (e.g., Hadoop, Hive, HBase) is a plus.

Mandatory Skill Sets: GCP, PySpark, Spark
Preferred Skill Sets: GCP, PySpark, Spark
Years of Experience Required: 4 - 8
Education Qualification: B.Tech / M.Tech / MBA / MCA
Degrees/Fields of Study required: Bachelor of Engineering, Master of Business Administration, Master of Engineering
Degrees/Fields of Study preferred: (not specified)
Certifications: (not specified)
Required Skills: Google Cloud Platform (GCP)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 12 more}
Desired Languages: (not specified)
Travel Requirements: (not specified)
Available for Work Visa Sponsorship?: (not specified)
Government Clearance Required?: (not specified)
Job Posting End Date: (not specified)

Posted 1 week ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Description AWS Fintech team is looking for a Data Engineering Manager to transform and optimize high-scale, world class financial systems that power the global AWS business. The success of these systems will fundamentally impact the profitability and financial reporting for AWS and Amazon. This position will play an integral role in leading programs that impact multiple AWS cost optimization initiatives. These programs will involve multiple development teams across diverse organizations to build sophisticated, highly reliable financial systems. These systems enable routine finance operations as well as machine learning, analytics, and GenAI reporting that enable AWS Finance to optimize profitability and free cash flow. This position requires a proactive, highly organized individual with an aptitude for data-driven decision making, a deep curiosity for learning new systems, and collaborative skills to work with both technical and financial teams. Key job responsibilities Build and lead a team of data engineers, application development engineers, and systems development engineers Drive execution of data engineering programs and projects Help our leadership team make challenging decisions by presenting well-reasoned and data-driven solution proposals and prioritizing recommendations. Identify and execute on opportunities for our organization to move faster in delivering innovations to our customers. This role has oncall responsibilities. A day in the life The successful candidate will build and grow a high-performing data engineering team to transform financial processes at Amazon. The candidate will be curious and interested in the capabilities of Large Language Model-based development tools like Amazon Q to help teams accelerate transformation of systems. The successful candidate will begin with execution to familiarize themselves with the space and then construct a strategic roadmap for the team to innovate. 
You thrive and succeed in an entrepreneurial environment, and are not hindered by ambiguity or competing priorities. You thrive when driving strategic initiatives and also dig in deep to get the job done. About The Team The AWS FinTech team enables the growth of earth’s largest cloud provider by building world-class finance technology solutions for effective decision making. We build scalable long-term solutions that provide transparency into financial business insights while ensuring the highest standards of data quality, consistency, and security. We encourage a culture of experimentation and invest in big ideas and emerging technologies. We are a globally distributed team with software development engineers, data engineers, application developers, technical program managers, and product managers. We invest in providing a safe and welcoming environment where inclusion, acceptance, and individual values are honored.

Basic Qualifications
- Experience managing a data or BI team
- 2+ years of experience processing data with a massively parallel technology (such as Redshift, Teradata, Netezza, Spark, or a Hadoop-based big data solution)
- 2+ years of experience with relational database technology (such as Redshift, Oracle, MySQL, or MS SQL)
- 2+ years of experience developing and operating large-scale data structures for business intelligence analytics (using ETL/ELT processes)
- 5+ years of data engineering experience
- Experience communicating to senior management and customers verbally and in writing
- Experience leading and influencing the data or BI strategy of your team or organization
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS

Preferred Qualifications
- Knowledge of software development life cycle or agile development environment with emphasis on BI practices
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with AWS tools and technologies (Redshift, S3, EC2)

Our inclusive culture empowers
Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - Amazon Dev Center India - Hyderabad Job ID: A2961772

Posted 1 week ago

Apply

4.0 - 9.0 years

8 - 18 Lacs

Bengaluru

Hybrid

Naukri logo

We have an immediate opening for a Big Data Developer with Encora Innovation Labs in Bangalore. Exp: 4 to 8 Yrs Location: Bangalore (Hybrid) Budget: Not a constraint for the right candidate Job Description: Spark and Scala; Hive, Hadoop; strong communication skills. If interested, please revert with your updated resume and passport-size photo along with the details below.
- Total Exp:
- Rel Exp:
- CTC:
- ECTC:
- Notice Period (Immediate to 15 Days):
- Current Location:
- Preferred Location:
- Any offers in hand:

Posted 1 week ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Line of Service Advisory Industry/Sector FS X-Sector Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers. 
Job location: Bangalore Total experience: 6 to 8 years Job Description
- Languages: Scala/Python 3.x
- File System: HDFS
- Frameworks: Spark 2.x/3.x (Batch/SQL API), Hadoop, Oozie/Airflow
- Databases: HBase, Hive, SQL Server, Teradata
- Version Control System: GitHub
- Other Tools: Zendesk, JIRA
Mandatory Skill Set: Scala/Python Preferred Skill Set: Scala/Python Years of experience required: 5+ Qualifications: B.Tech Education (if blank, degree and/or field of study not specified) Degrees/Field Of Study Required Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Python (Programming Language) Optional Skills Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date

Posted 1 week ago

Apply

1.0 years

4 - 6 Lacs

Hyderābād

On-site

GlassDoor logo

- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Do you want to be a leader in the team that takes Transportation and Retail models to the next generation? Do you have solid analytical thinking and metrics-driven decision making, and do you want to solve problems with solutions that will meet the growing worldwide need? Then Transportation is the team for you. We are looking for top-notch Data Engineers to be part of our world-class Business Intelligence for Transportation team.
- 4-7 years of experience performing quantitative analysis, preferably for an Internet or Technology company
- Strong experience in Data Warehouse and Business Intelligence application development
- Data Analysis: understand business processes, logical data models and relational database implementations
- Expert knowledge in SQL; able to optimize complex queries
- Basic understanding of statistical analysis; experience in testing design and measurement
- Able to execute research projects and generate practical results and recommendations
- Proven track record of working on complex modular projects, and assuming a leading role in such projects
- Highly motivated, self-driven, capable of defining own design and test scenarios
- Experience with scripting languages, e.g. Perl, Python, preferred
- BS/MS degree in Computer Science
- Evaluate and implement various big-data technologies and solutions (Redshift, Hive/EMR, Tez, Spark) to optimize processing of extremely large datasets in an accurate and timely fashion
Experience with large-scale data processing, data structure optimization and scalability of algorithms is a plus.

Key job responsibilities
1. Responsible for designing, building and maintaining complex data solutions for Amazon's Operations businesses
2. Actively participates in the code review process, design discussions, team planning, and operational excellence, and constructively identifies problems and proposes solutions
3. Makes appropriate trade-offs, re-uses where possible, and is judicious about introducing dependencies
4. Makes efficient use of resources (e.g., system hardware, data storage, query optimization, AWS infrastructure, etc.)
5. Knows about recent advances in distributed systems (e.g., MapReduce, MPP architectures, external partitioning)
6. Asks the right questions when the data model and requirements are not well defined, and comes up with designs that are scalable, maintainable and efficient
7. Makes enhancements that improve the team’s data architecture, making it better and easier to maintain (e.g., data auditing solutions, automating ad-hoc or manual operation steps)
8. Owns the data quality of important datasets and any new changes/enhancements

Experience with big data technologies such as Hadoop, Hive, Spark, EMR. Experience with any ETL tool such as Informatica, ODI, SSIS, BODI, Datastage, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
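The query-optimization duties above can be illustrated with a minimal, self-contained sketch using Python's built-in sqlite3 module as a stand-in for a warehouse engine like Redshift or Hive. The table and index names are invented for the example; the point is only that a covering index turns a full table scan into an index search, which is the kind of trade-off the role asks about.

```python
import sqlite3

# Toy shipments table standing in for a warehouse fact table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shipments (id INTEGER PRIMARY KEY, region TEXT, cost REAL)")
conn.executemany(
    "INSERT INTO shipments (region, cost) VALUES (?, ?)",
    [("NA", 10.0), ("EU", 12.5), ("NA", 8.0), ("APAC", 15.0)] * 250,
)

query = "SELECT region, SUM(cost) FROM shipments WHERE region = ? GROUP BY region"

# Without an index the planner must scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, ("NA",)).fetchall()

# A covering index on (region, cost) lets the same query use an index search.
conn.execute("CREATE INDEX idx_shipments_region ON shipments (region, cost)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, ("NA",)).fetchall()

print(plan_before[0][-1])  # a SCAN step
print(plan_after[0][-1])   # a SEARCH step using the covering index
```

The same reasoning (inspect the plan, then add or adjust an index or sort/distribution key) carries over to MPP engines, though the plan syntax differs per platform.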

Posted 1 week ago

Apply

0 years

2 - 2 Lacs

Gurgaon

On-site

GlassDoor logo

Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title and Summary Senior Data Scientist Who is Mastercard? Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships, and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all. Our Team: As consumer preference for digital payments continues to grow, ensuring a seamless and secure consumer experience is top of mind. Optimization Solutions team focuses on tracking of digital performance across all products and regions, understanding the factors influencing performance and the broader industry landscape. This includes delivering data-driven insights and business recommendations, engaging directly with key external stakeholders on implementing optimization solutions (new and existing), and partnering across the organization to drive alignment and ensure action is taken. Are you excited about Data Assets and the value they bring to an organization? Are you an evangelist for data-driven decision-making? 
Are you motivated to be part of a team that builds large-scale Analytical Capabilities supporting end users across 6 continents? Do you want to be the go-to resource for data science & analytics in the company?

The Role:
- Work closely with the global optimization solutions team to architect, develop, and maintain advanced reporting and data visualization capabilities on large volumes of data to support data insights and analytical needs across products, markets, and services
- Focus on building solutions using Machine Learning and creating actionable insights to support product optimization and sales enablement
- Prototype new algorithms, experiment, evaluate and deliver actionable insights
- Drive the evolution of products with an impact focused on data science and engineering
- Design machine learning systems and self-running artificial intelligence (AI) software to automate predictive models
- Perform data ingestion, aggregation, and processing on high-volume and high-dimensionality data to drive and enable data unification and produce relevant insights
- Continuously innovate and determine new approaches, tools, techniques & technologies to solve business problems and generate business insights & recommendations
- Apply knowledge of metrics, measurements, and benchmarking to complex and demanding solutions

All about You
- A superior academic record at a leading university in Computer Science, Data Science, Technology, mathematics, statistics, or a related field, or equivalent work experience
- Experience in data management, data mining, data analytics, data reporting, data product development and quantitative analysis
- Strong analytical skills with a track record of translating data into compelling insights
- Prior experience working in a product development role
- Knowledge of ML frameworks, libraries, data structures, data modeling, and software architecture
- Proficiency in using Python/Spark, Hadoop platforms & tools (Hive, Impala, Airflow, NiFi), and SQL to build Big Data products & platforms
- Experience with an Enterprise Business Intelligence Platform/Data platform (e.g., Tableau, Power BI) is a plus
- Demonstrated success interacting with stakeholders to understand technical needs and ensuring analyses and solutions meet their needs effectively
- Ability to build a strong narrative on the business value of products and actively participate in sales enablement efforts
- Able to work in a fast-paced, deadline-driven environment as part of a team and as an individual contributor

Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard’s security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

Posted 1 week ago

Apply

175.0 years

0 Lacs

Gurgaon

On-site

GlassDoor logo

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. Overview of the Business: Global Services (GS) comprises several interconnected business units which collectively provide service on a global scale, playing a central role in helping American Express achieve its vision of being the world’s most respected service brand. As part of GS, the Global Servicing Enablement (GSE) team is responsible for process design & engineering, capacity management, governance, analytics, value generation and learning across GSG. The team is accountable for providing unwavering support to all our Customer Care Professionals and Specialists who serve our customers globally every day. GSE is also responsible for the Enterprise Complaint Center of Excellence, chartered with ensuring American Express has a robust complaints management program. The position is in Global Planning and Contact Optimization (GPCO). GPCO is responsible for capacity/staff planning and real-time performance management & monitoring for GS across various markets globally. The group executes plans built by the Forecasting & Business Planning teams and manages 24/7 real-time performance in the voice and digital channels. The group ensures that robust schedules are designed to meet the demand of daily operations. The schedules are aligned to intraday/intraweek call volume distributions for all markets and lines of business.
As a part of the team, you will be responsible for the following:
- Intra-day call type/segment performance management
- Shrinkage / off-the-phone activities management; intraday schedules management and recommending schedule changes based on business requirements
- Execution of Service Code Alert strategies
- Acting as the centralized contact for operations leaders for real-time business performance management
- Working with the Short Term Forecasting Team on Intra Day Performance (IDP) & staffing outlook
- Communicating systems, voice response & telecommunication issues to the relevant teams
- Proactively identifying process improvement opportunities
- Maintaining strong relationships with the operation’s leaders to improve overall understanding and awareness of daily/weekly business impacts
Shift Rotations: 24x7

Minimum Qualifications Functional skills:
- Bachelor’s degree (Economics / Mathematics / Statistics / Data Analytics); MBA or equivalent is a plus
- 2+ years of relevant experience in Workforce real-time management / Operations / MIS analytics would be preferred
- Proficiency in Workforce Management Tools such as Avaya, eWFM, Genesys/ConneX, as well as an understanding of call center volume drivers and forecasting/workforce planning processes, would be an added advantage
- Strong written and verbal communication skills with demonstrated success in creating and conducting presentations to large / senior / challenging audiences, a plus
- Strong organizational and project management skills
- Proven ability to manage multiple priorities effectively, with a track record of driving results while meeting deadlines
- Strong relationship and collaboration skills, including the ability to work in a highly matrixed environment

Behavioral Skills/Capabilities:
- Delivers high-quality work with direction and oversight
- Understands work goals and seeks to understand their importance to the business
- Feels comfortable taking decisions / calculated risks based on facts and intuition
- Flexible to quickly adjust around shifting priorities, multiple demands, ambiguity, and rapid change
- Maintains a positive attitude when presented with a barrier
- Demonstrated ability to challenge the status quo & build consensus
- Effective team player with a high level of integrity

Technical Skills / Knowledge of platforms:
- Proficiency with Microsoft Office, especially Excel and PowerPoint
- Project management skills; knowledge and experience of successfully leading projects, a plus
- Ability to handle large data sets; prior programming experience in SAS, SQL, Python and/or HQL (Hive Query Language) to write code independently and efficiently will be useful
- Knowledge of machine learning will be an added advantage
- Exposure to Big Data platforms such as Cornerstone and visualization tools such as Tableau, a nice to have

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law.
Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 1 week ago

Apply

6.0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Linkedin logo

Assistant Manager with 6-7 years' experience for a Battery/Cell Modeling Engineer role.
- Lead and mentor a team of data scientists and analysts working on vehicle telemetry, usage, and performance data
- Write and optimize complex SQL queries to analyze large-scale, time-series EV telemetry datasets (e.g., battery voltage/current, temperature, SOH, motor efficiency, torque, speed, trip patterns)
- Develop diagnostic algorithms and SOH estimation frameworks using real-world data to monitor and predict subsystem health (battery, motor, thermal)
- Translate raw data into high-impact visualizations, KPIs, and trend analyses to support system simulations and control strategy tuning
- Collaborate with 1D system simulation engineers to correlate model-based results with real-world behavior and suggest model improvements or boundary conditions
- Partner with requirement engineers and product managers to define data-backed feature requirements and validate them through in-field performance metrics
- Build dashboards and pipelines for field failure diagnostics, early-warning systems, and OTA update validations
- Interface with firmware, controls, and thermal teams to identify improvement areas based on field usage patterns
- Design A/B experiments or cohort studies to measure the impact of software updates, control calibrations, and new features on vehicle performance

Must Have:
- Strong SQL skills; experience with distributed data processing (e.g., Spark, Hive, Presto) is a plus
- Proficient in Python (Pandas, NumPy, Matplotlib, scikit-learn); experience in applying ML techniques to engineering datasets is advantageous
- Deep domain knowledge of EV subsystems: battery pack behavior, motor drive efficiency, thermal limits, and SOH/SOC estimation
- Experience working with simulation or modeling teams (1D/0D tools like Simulink, AMESim, GT-Suite, etc.) to validate or enhance model accuracy using field data
- Proven leadership experience in a cross-functional setup
Must Have: Bachelor's or Master’s degree in Data Science, Computer Science, Electrical/Mechanical Engineering, or a related field. 6+ years of experience in data analytics or data science, preferably with time-series/telemetry data in automotive, energy, or mobility domains. Good To Have: PhD/Master’s degree in Data Science, Computer Science, Electrical/Mechanical Engineering, or a related field. 4+ years of experience in data analytics or data science, preferably with time-series/telemetry data in automotive, energy, or mobility domains.
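The time-series telemetry analysis described in this role can be sketched with SQL window functions via Python's stdlib sqlite3. The schema and signal names here are illustrative only; a production system would run similar SQL on Spark, Hive, or Presto over far larger datasets. The query computes a per-vehicle 3-sample moving average of pack voltage, a typical smoothing step before trending SOH-style metrics.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE telemetry (vehicle_id TEXT, ts INTEGER, pack_voltage REAL)")
rows = [
    ("EV1", 1, 396.0), ("EV1", 2, 395.2), ("EV1", 3, 394.1), ("EV1", 4, 393.0),
    ("EV2", 1, 401.5), ("EV2", 2, 400.9), ("EV2", 3, 400.1),
]
conn.executemany("INSERT INTO telemetry VALUES (?, ?, ?)", rows)

# Moving average over the last 3 samples, computed independently per vehicle.
sql = """
SELECT vehicle_id, ts,
       AVG(pack_voltage) OVER (
           PARTITION BY vehicle_id ORDER BY ts
           ROWS BETWEEN 2 PRECEDING AND CURRENT ROW
       ) AS voltage_ma3
FROM telemetry
ORDER BY vehicle_id, ts
"""
result = conn.execute(sql).fetchall()
for vehicle_id, ts, ma in result:
    print(vehicle_id, ts, round(ma, 2))
```

The `PARTITION BY vehicle_id` clause is what keeps one vehicle's signal from bleeding into another's window, which matters when fleets of thousands of vehicles share one table.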

Posted 1 week ago

Apply

5.0 - 8.0 years

2 - 3 Lacs

Chennai

On-site

GlassDoor logo

Job Description The Role The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Developing and supporting scalable, extensible, and highly available data solutions
- Delivering on critical business priorities while ensuring alignment with the wider architectural vision
- Identifying and helping address potential risks in the data supply chain
- Following and contributing to technical standards
- Designing and developing analytical data models

Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology/MCA
- 5 to 8 years’ experience implementing data-intensive solutions using agile methodologies
- Experience of relational databases and using SQL for data querying, transformation and manipulation
- Experience of modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud-native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills

Technical Skills (Must Have) ETL: Hands-on experience of building data pipelines.
Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica.
- Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Understanding of Data Warehousing concepts, and of Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
- File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
- Others: Basics of job schedulers like Autosys; basics of entitlement management
Certification on any of the above topics would be an advantage.
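As a toy illustration of the ETL work this role centers on, here is a minimal extract-transform-load pipeline in pure Python: extract CSV records, transform (type-cast and quarantine bad rows), and load validated rows into SQLite. The file layout and field names are invented for the sketch; real pipelines in platforms like Ab Initio or Spark follow the same three stages at scale.

```python
import csv
import io
import sqlite3

# Extract: parse CSV records (an in-memory file stands in for a landed file).
raw = io.StringIO("account_id,balance\nA1,100.50\nA2,-3.25\nA3,bad\n")
records = list(csv.DictReader(raw))

# Transform: type-cast, quarantining rows that fail validation for later review.
clean, rejects = [], []
for rec in records:
    try:
        clean.append((rec["account_id"], float(rec["balance"])))
    except ValueError:
        rejects.append(rec)

# Load: write only validated rows to the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE balances (account_id TEXT PRIMARY KEY, balance REAL)")
conn.executemany("INSERT INTO balances VALUES (?, ?)", clean)
conn.commit()

loaded = conn.execute("SELECT COUNT(*) FROM balances").fetchone()[0]
print(loaded, len(rejects))  # 2 valid rows loaded, 1 row quarantined
```

Keeping a reject/quarantine path rather than silently dropping bad rows is one of the "data quality & controls" practices the posting lists.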
- Job Family Group: Technology - Job Family: Digital Software Engineering - Time Type: - Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 1 week ago

Apply

6.0 - 12.0 years

8 - 10 Lacs

Chennai

On-site

GlassDoor logo

Job Description: About Us At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities, and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us! Global Business Services Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services. Process Overview* The Analytics and Intelligence Engine (AIE) team transforms analytical and operational data into Consumer and Wealth Client insights and enables personalization opportunities that are provided to Associate and Customer-facing operational applications. 
The big data technologies used here are Hadoop, PySpark/Scala, and HQL for ETL, with Unix as the file-landing environment, plus real-time (or near-real-time) streaming applications. Job Description* We are actively seeking a talented and motivated Senior Hadoop Developer/Lead to join our dynamic and energetic team. As a key contributor to our agile scrum teams, you will collaborate closely with the Insights division. We are looking for a candidate who can showcase strong technical expertise in Hadoop and related technologies, and who excels at collaborating with both onshore and offshore team members. The role requires both hands-on coding and collaboration with stakeholders to drive strategic design decisions. While functioning as an individual contributor for one or more teams, the Senior Hadoop Data Engineer may also have the opportunity to lead and take responsibility for end-to-end solution design and delivery, based on the scale of implementation and required skillsets.

Responsibilities*
- Develop high-performance and scalable solutions for Insights, using the Big Data platform to facilitate the collection, storage, and analysis of massive data sets from multiple channels
- Utilize your in-depth knowledge of Hadoop stack and storage technologies, including HDFS, Spark, Scala, MapReduce, Yarn, Hive, Sqoop, Impala, Hue, and Oozie, to design and optimize data processing workflows
- Implement near-real-time and streaming data solutions to provide up-to-date information to millions of Bank customers, using Spark Streaming and Kafka
- Collaborate with cross-functional teams to identify system bottlenecks, benchmark performance, and propose innovative solutions to enhance system efficiency
- Take ownership of defining Big Data strategies and roadmaps for the Enterprise, aligning them with business objectives
- Apply your expertise in NoSQL technologies like MongoDB, SingleStore, or HBase to efficiently handle diverse data types and storage requirements
Stay abreast of emerging technologies and industry trends related to Big Data, continuously evaluating new tools and frameworks for potential integration. Provide guidance and mentorship to junior teammates.

Requirements* Education*: Graduation / Post Graduation: BE/B.Tech/MCA. Certifications, if any: NA. Experience Range*: 6 to 12 years.

Foundational Skills*
- Minimum of 7 years of industry experience, with at least 5 years focused on hands-on work in the Big Data domain
- Highly skilled in Hadoop stack technologies, such as HDFS, Spark, Hive, Yarn, Sqoop, Impala and Hue
- Strong proficiency in programming languages such as Python, Scala, and Bash/shell scripting
- Excellent problem-solving abilities and the capability to deliver effective solutions for business-critical applications
- Strong command of visual analytics tools, with a focus on Tableau

Desired Skills*
- Experience in real-time streaming technologies like Spark Streaming, Kafka, Flink, or Storm
- Proficiency in NoSQL technologies like HBase, MongoDB, SingleStore, etc.
- Familiarity with cloud technologies such as Azure, AWS, or GCP
- Working knowledge of machine learning algorithms, statistical analysis, and programming languages (Python or R) to conduct data analysis and develop predictive models to uncover valuable patterns and trends
- Proficiency in data integration and data security within the Hadoop ecosystem, including knowledge of Kerberos

Work Timings* 12:00 PM to 9:00 PM IST. Job Location* Chennai, Mumbai
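The near-real-time streaming work this posting describes (Spark Streaming, Kafka) ultimately reduces to windowed aggregation over an unbounded event stream. The pure-Python sketch below mimics that core operation at toy scale with a bounded deque; it is only a conceptual stand-in, not a substitute for the actual streaming stack, and the metric name is invented.

```python
from collections import deque

def windowed_averages(events, window=3):
    """Emit the mean of the last `window` values after each event --
    the core operation behind a sliding-window stream aggregation."""
    buf = deque(maxlen=window)  # old values fall off automatically
    out = []
    for value in events:
        buf.append(value)
        out.append(sum(buf) / len(buf))
    return out

# Simulated per-event metric (e.g. transaction amounts read off a topic).
stream = [10.0, 20.0, 30.0, 40.0]
print(windowed_averages(stream))  # [10.0, 15.0, 20.0, 30.0]
```

In Spark Streaming the same idea appears as a window over micro-batches; the deque's `maxlen` plays the role of the window duration, and each appended value the role of an arriving record.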

Posted 1 week ago

Apply

2.0 - 4.0 years

2 - 3 Lacs

Chennai

On-site

GlassDoor logo

The Data Science Analyst 2 is a developing professional role. The analyst applies specialty-area knowledge in monitoring, assessing, analyzing, and evaluating processes and data; identifies policy gaps and formulates policies; interprets data and makes recommendations; and researches and interprets factual information. The role identifies inconsistencies in data or results, defines business issues, and formulates recommendations on policies, procedures, or practices. It integrates established disciplinary knowledge within its own specialty area with a basic understanding of related industry practices, and requires a good understanding of how the team interacts with others in accomplishing the objectives of the area, along with a developing working knowledge of industry practices and standards. The role has a limited but direct impact on the business through the quality of the tasks/services provided, restricted to the job holder's own team.

Responsibilities:

The Data Engineer is responsible for building data engineering solutions using next-generation data techniques. The individual will work with tech leads, product owners, customers, and technologists to deliver data products/solutions in a collaborative and agile environment, and is responsible for the design and development of big data solutions.
- Partner with domain experts, product managers, analysts, and data scientists to develop Big Data pipelines in Hadoop.
- Move all legacy workloads to the cloud platform.
- Work with data scientists to build Client pipelines using heterogeneous sources and provide engineering services for data science applications.
- Ensure automation through CI/CD across platforms, both in the cloud and on-premises.
- Define needs around maintainability, testability, performance, security, quality, and usability for the data platform.
- Drive implementation of consistent patterns, reusable components, and coding standards for data engineering processes.
- Convert SAS-based pipelines into languages such as PySpark or Scala to execute on Hadoop, Snowflake, and non-Hadoop ecosystems.
- Tune big data applications on Hadoop, cloud, and non-Hadoop platforms for optimal performance.
- Apply an in-depth understanding of how data analytics integrates within the sub-function, and coordinate and contribute to the objectives of the entire function.
- Produce detailed analysis of issues where the best course of action is not evident from the information available, but actions must be recommended or taken.
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency.

Qualifications:

- 2-4 years of total IT experience
- Experience with Hadoop (Cloudera), big data technologies, cloud, and AI tools
- Hands-on experience with HDFS, MapReduce, Hive, Impala, Spark, Kafka, Kudu, Kubernetes, dashboard tools, Snowflake, AWS tools, AI/ML libraries and tools, etc.
- Experience designing and developing data pipelines for data ingestion or transformation.
- System-level understanding: data structures, algorithms, distributed storage and compute tools, SQL expertise, shell scripting, scheduling tools, and Scrum/Agile methodologies.
- A can-do attitude toward solving complex business problems, with good interpersonal and teamwork skills.

Education: Bachelor's/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. - Job Family Group: Technology - Job Family: Data Science - Time Type: Full time - Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 1 week ago

Apply

170.0 years

0 Lacs

Chennai

On-site

GlassDoor logo

Job ID: 30687 Location: Chennai, IN Area of interest: Technology Job type: Regular Employee Work style: Office Working Opening date: 2 Jun 2025

Job Summary

Product Specialists drive solutions to business problems, identify opportunities to automate business processes, offer solutions for new products and services, and are crucial to the product development process, from concept to production, as well as to the performance of the developed product or function in all aspects. With their commitment to innovation and knowledge of T24, these analysts seek, develop, and help implement strategic initiatives for improved efficiency and productivity. This person should also be wholly committed to the development of innovative, reliable functions in an ever-changing digital banking landscape.

Key Responsibilities

- Responsible for the creation and improvement of product and T24 business functions, ensuring business requirements and pain points are addressed through native T24 product solutions.
- Define product specifications, features, and performance requirements based on business requirements.
- Drive awareness of technical specifications, requirements, and solutions across squad members and testing counterparts.
- Ensure compliance with Group standards such as coding standards, quality standards, deployment standards, and the release management process.
- Coordinate with cross hives according to business needs, such as Production Engineering, Interface Integration, Downstream Systems, and Change Governance.
- Develop meaningful and lasting relationships with stakeholders to create value for the group.
- Collaborate with QA members and provide thorough quality assurance at every stage of Agile systems development.

Strategy

- Ability to work independently and as part of a team.
- A creative mindset and a passion for innovation and continuous improvement.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Keep the long-term vision of the product in mind while addressing short-term goals and milestones.
- Continuously evaluate the app's performance and user-engagement metrics to ensure alignment with strategic objectives.
- Develop apps with scalability and performance in mind; anticipate future growth and ensure that the app architecture and codebase can accommodate it.
- Drive a modern approach to product delivery excellence, ensuring fast, frequent, and high-quality value delivery.
- Enforce and streamline sound development practices.
- Establish and maintain effective governance processes, including KT, training, advice, and support, to assure the classic pay product is developed, implemented, and maintained in alignment with the Group's standards.

Business

- Experienced practitioner and hands-on contributor to the squad delivery for their craft (e.g. Engineering).
- Responsible for balancing skills and capabilities across teams (squads) and hives in partnership with the Chief Product Owner and Hive Leadership, in alignment with the fixed capacity model.
- Responsible for evolving the craft toward improved automation, simplification, and innovative use of the latest market trends.
- Trusted advisor to the business: work hand in hand with the business, taking product programs from investment decisions into design, specification, and solution phases, all the way to operations on the ground, and securing support services from other teams.
- Provide leadership and technical expertise for the subdomain to achieve goals and outcomes.
- Support the respective businesses in the commercialisation of capabilities, bid teams, monitoring of usage, improving client experience, and collecting defects for future improvements.
- Manage business partner expectations, ensuring delivery to the business on time, within cost, and with high quality.

Processes

- Adopt and embed the eSDLC process and Change Delivery Standards throughout the lifecycle of the product/service.
- Follow the chapter operating model to ensure a system exists to continue building the capability and performance of the chapter.
- Define standards to ensure that applications are designed with scale, resilience, and performance in mind.
- Enforce and streamline sound development practices, and establish and maintain effective governance processes, including training, advice, and support, to assure the platforms are developed, implemented, and maintained in alignment with the Group's standards.
- Responsible for overall governance of the subdomain, including financial management, risk management, representation in steering committee reviews, and engagement with the business on strategy, change management, and timely course correction as required.
- Ensure compliance with the highest standards of business conduct, regulatory requirements, and practices defined by internal and external requirements, including compliance with local banking laws and anti-money-laundering stipulations.

People & Talent

- Accountable for people management and capability development of their Chapter members.
- Reviews metrics on capabilities and performance across their area, maintains an improvement backlog for their Chapters, and drives continual improvement of their chapter.
- Focuses on the development of people and capabilities as the highest priority.
- Ensure that the organisation works proactively to upgrade capacity well in advance and predict future capacity needs.
- Responsible for building an engineering culture where application and infrastructure scalability is paramount for ongoing capacity management, with the aim of reducing the need for capacity reviews through monitoring and auto-scale properties.
- Empower the engineers so that they can provide economies of scale focused on delivering value, speed to market, availability, monitoring, and system management.
- Foster a culture of innovation, transparency, and accountability end to end in the subdomain, while promoting a "business-first" mentality at all levels.

Risk Management

- Responsible for effective capacity risk management across the Chapter with regard to attrition and leave plans.
- Ensure the chapter follows the standards with respect to risk management as applicable to its chapter domain, and adheres to common practices to mitigate risk in its respective domain.
- Effectively and collaboratively identify, escalate, mitigate, and resolve risk, conduct, and compliance matters.
- Develop and maintain a plan that provides for succession and continuity in the most critical delivery and management positions.

Governance

Display exemplary conduct and live by the Group's Values and Code of Conduct.
Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines, and the Group Code of Conduct.

Regulatory & Business Conduct

- Display exemplary conduct and live by the Group's Values and Code of Conduct.
- Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines, and the Group Code of Conduct.
- Lead the WRB T24 Platform dev team of a squad to achieve the outcomes set out in the Bank's Conduct Principles: Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.
- Effectively and collaboratively identify, escalate, mitigate, and resolve risk, conduct, and compliance matters.
- Serve as a Director of the Board.
- Exercise authorities delegated by the Board of Directors and act in accordance with the Articles of Association (or equivalent).

Skills and Experience

- T24: at least three of the T24 modules, viz. Securities (equities, funds, FI, etc.), Derivatives, AA Loans, Deposits, and the core modules of T24.
- Analytical and reasoning abilities.
- Ability to manage stakeholders and user testing.
- Communication skills, with an ability to translate findings into actionable insights.
- Go-getter, solution-oriented, open to challenges; embraces a dynamic and diverse work culture and owns and performs work like an entrepreneur.

Qualifications

- Bachelor's degree (or equivalent) in information technology, computer science, business administration, or finance.
- Experience in an Agile environment.
- Able to read T24-based code, perform code/impact analysis, perform SQL querying, and, if needed, perform T24-based coding.
- Ability to create scalable and reliable Infobasic code for T24 product functions.
- Ability to handle performance issues and performance improvements in T24 instances and code.
- Experience in DevOps, TAFJ runtime configuration and navigation, and integration methodologies preferred.
- T24 migration/upgrade experience preferred.
- Exposure to T24 Securities, AA, and Lending products preferred.

Competencies

Action Oriented; Collaborates; Customer Focus; Gives Clarity & Guidance; Manages Ambiguity; Develops Talent; Drives Vision & Purpose; Nimble Learning; Decision Quality; Courage; Instills Trust; Strategic Mindset

Technical Competencies: This is a generic competency to evaluate the candidate on role-specific technical skills and requirements.

About Standard Chartered

We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge, and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents, and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.
Together we:

- Do the right thing, are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do.
- Never settle, continuously striving to improve and innovate, keeping things simple, and learning from doing well and not so well.
- Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term.

What we offer

In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial, and social wellbeing.

- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
- Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum), and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to a minimum of 30 days.
- Flexible working options based around home and office locations, with flexible working patterns.
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform; development courses for resilience and other human skills; a global Employee Assistance Programme; sick leave; mental health first-aiders; and a range of self-help toolkits.
- A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual, and digital learning.
- Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions, and geographies, where everyone feels respected and can realise their full potential.

Recruitment Assessments

Some of our roles use assessments to help us understand how suitable you are for the role you've applied to. If you are invited to take an assessment, this is great news: it means your application has progressed to an important stage of our recruitment process.
Visit our careers website: www.sc.com/careers

Posted 1 week ago

Apply

170.0 years

5 - 9 Lacs

Chennai

On-site

GlassDoor logo

Job ID: 30635 Location: Chennai, IN Area of interest: Technology Job type: Regular Employee Work style: Office Working Opening date: 2 Jun 2025

Job Summary

The role is accountable for tactical and operational support of production services across one or more areas of a specific platform/domain: ensuring maximum service quality and stability through fast, effective response to technical incidents, and acting as a catalyst for change through analysis and identification of continual service improvement opportunities. Depending on the area of technical specialisation, in addition to incident resolution and prevention, the role may also act in a control capacity to ensure that new changes to the technology estate do not introduce instability.

- Manage technical resumption of high-priority, S@R, and medium/high-severity incidents; provide end-to-end support and implement resolutions to resolve incidents within SLA.
- Provide root cause analysis for S@R and medium/high-severity issues, and ensure all follow-up action points are carried out.
- Responsible for the stability of the production system; direct second- and third-level support for problem diagnosis and resolution as per the agreed SLAs.
- Responsible for managing production-related changes, releases, and rollouts with zero or minimal impact to the stability of the application; review dependent changes to surrounding systems, infrastructure, networking, etc.
- Responsible for ensuring proper technical plans are in place for all production changes (e.g. fallback plan, implementation plan, data conversion, etc.).
- Create and update production support documentation, contingency (DR/BCP) documentation, and processes.
- Provide inputs to the PSS manager for the monthly dashboard that reports incident and problem trends along with SIP and RCA action items.
- Participate in and support cross-training and knowledge-transfer activities within support teams.

Key Responsibilities

Strategy: Accountable for executing the strategy devised for the business unit.
Business: Fully accountable for incident, problem, change, and risk management relating to the production application/system.
Processes: Create, review, and update production support documentation; update contingency (DR/BCP) documentation and processes.
People & Talent: Participate in cross-training and knowledge-transfer activities within support teams.
Risk Management: Proactively identify risks in the application and manage mitigation actions. Responsible for managing, tracking, and timely closure of risks and other compliance-related issues in Riskwise (information security risks) and M7 (operational risks).
Governance: Provide inputs to management for the monthly dashboard that reports incident and problem trends along with SIP and RCA action items.

Regulatory & Business Conduct

- Display exemplary conduct and live by the Group's Values and Code of Conduct.
- Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines, and the Group Code of Conduct.
- Lead to achieve the outcomes set out in the Bank's Conduct Principles: Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.
- Effectively and collaboratively identify, escalate, mitigate, and resolve risk, conduct, and compliance matters.
Key Stakeholders

SRE PO; Hive PO; SRE Lead; Business Stakeholders; CIO counterparts

Other Responsibilities

- Embed Here for good and the Group's brand and values in the Support team.
- Perform other responsibilities assigned under Group, Country, Business, or Functional policies and procedures.
- Multiple functions (double hats).

Skills and Experience

AWS; Oracle; Linux; Kubernetes; API

Qualifications

- AWS Certification
- SRE Certification
- ITIL Certification (good to have)

Competencies

Action Oriented; Collaborates; Customer Focus; Gives Clarity & Guidance; Manages Ambiguity; Develops Talent; Drives Vision & Purpose; Nimble Learning; Decision Quality; Courage; Instills Trust; Strategic Mindset

Technical Competencies: This is a generic competency to evaluate the candidate on role-specific technical skills and requirements.
Visit our careers website: www.sc.com/careers

Posted 1 week ago

Apply

2.0 - 4.0 years

2 - 8 Lacs

Noida

On-site

GlassDoor logo

- Expertise in AWS services such as EC2, CloudFormation, S3, IAM, SNS, SQS, EMR, Athena, Glue, Lake Formation, etc.
- Expertise in Hadoop/EMR/Databricks, with good debugging skills to resolve Hive- and Spark-related issues.
- Sound fundamentals of database concepts and experience with relational or non-relational database types such as SQL, key-value, graph, etc.
- Experience in infrastructure provisioning using CloudFormation, Terraform, Ansible, etc.
- Experience in programming languages such as Python/PySpark.
- Excellent written and verbal communication skills.

Key Responsibilities

- Work closely with the data lake engineers to provide technical guidance, consultation, and resolution of their queries.
- Assist in the development of simple and advanced analytics best practices, processes, technology and solution patterns, and automation (including CI/CD).
- Work closely with various stakeholders on the US team with a collaborative approach.
- Develop data pipelines in Python/PySpark to be executed in the AWS cloud.
- Set up analytics infrastructure in AWS using CloudFormation templates.
- Develop mini/micro-batch and streaming ingestion patterns using Kinesis/Kafka.
- Seamlessly upgrade applications to higher versions, such as Spark/EMR upgrades.
- Participate in code reviews of the developed modules and applications.
- Provide inputs for formulating best practices for ETL processes/jobs written in programming languages such as PySpark, and for BI processes.
- Work with column-oriented data storage formats such as Parquet, interactive query services such as Athena, and the event-driven computing cloud service Lambda.
- Perform R&D on the latest big data tools in the market, perform comparative analysis, and provide recommendations to choose the best tool for the current and future needs of the enterprise.
Required Qualifications

- Bachelor's or Master's degree in Computer Science or a similar field
- 2-4 years of strong experience in big data development
- Expertise in AWS services such as EC2, CloudFormation, S3, IAM, SNS, SQS, EMR, Athena, Glue, Lake Formation, etc.
- Expertise in Hadoop/EMR/Databricks, with good debugging skills to resolve Hive- and Spark-related issues.
- Sound fundamentals of database concepts and experience with relational or non-relational database types such as SQL, key-value, graph, etc.
- Experience in infrastructure provisioning using CloudFormation, Terraform, Ansible, etc.
- Experience in programming languages such as Python/PySpark.
- Excellent written and verbal communication skills.

Preferred Qualifications

- Cloud certification (AWS, Azure, or GCP)

About Our Company

Ameriprise India LLP has been providing client-based financial solutions to help clients plan and achieve their financial objectives for 125 years. We are a U.S.-based financial planning company headquartered in Minneapolis with a global presence. The firm's focus areas include asset management and advice, retirement planning, and insurance protection. Be part of an inclusive, collaborative culture that rewards you for your contributions, and work with other talented individuals who share your passion for doing great work. You'll also have plenty of opportunities to make your mark at the office and a difference in your community. So if you're talented, driven, and want to work for a strong, ethical company that cares, take the next step and create a career at Ameriprise India LLP.

Ameriprise India LLP is an equal opportunity employer. We consider all qualified applicants without regard to race, color, religion, sex, genetic information, age, sexual orientation, gender identity, disability, veteran status, marital status, family status, or any other basis prohibited by law.

Full-Time/Part-Time: Full time
Timings: 2:00p-10:30p
India Business Unit: AWMPO AWMP&S President's Office
Job Family Group: Technology

Posted 1 week ago

Apply

4.0 - 9.0 years

17 - 27 Lacs

Chennai, Bengaluru

Work from Office

Naukri logo

Role & responsibilities • Experience with big data technologies (Hadoop, Spark, Hive) • Proven experience as a development data engineer or in a similar role, with an ETL background • Experience with data integration/ETL best practices and data quality principles • Play a crucial role in ensuring the quality and reliability of the data by designing, implementing, and executing comprehensive testing • Build a comprehensive code base and business rules for testing and validating the data based on the User Stories • Knowledge of continuous integration and continuous deployment (CI/CD) pipelines • Familiarity with Agile/Scrum development methodologies • Excellent analytical and problem-solving skills • Strong communication and collaboration skills

Posted 1 week ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

About PhonePe Group: PhonePe is India's leading digital payments company, with 50 crore (500 million) registered users and 3.7 crore (37 million) merchants, covering over 99% of the postal codes across India. On the back of its leadership in digital payments, PhonePe has expanded into financial services (insurance, mutual funds, stock broking, and lending) as well as adjacent tech-enabled businesses such as Pincode for hyperlocal shopping and Indus App Store, which is India's first localized app store. The PhonePe Group is a portfolio of businesses aligned with the company's vision to offer every Indian an equal opportunity to accelerate their progress by unlocking the flow of money and access to services.

Culture

At PhonePe, we take extra care to make sure you give your best at work, every day! And creating the right environment for you is just one of the things we do. We empower people and trust them to do the right thing. Here, you own your work from start to finish, right from day one. Being enthusiastic about tech is a big part of being at PhonePe. If you like building technology that impacts millions, ideating with some of the best minds in the country, and executing your dreams with purpose and speed, join us!

Responsibilities

- Diagnose, troubleshoot, and fix production software issues.
- Develop and own monitoring solutions for production and non-production environments and applications.
- Develop smaller-complexity features and enhancements in existing stable system components.
- Adapt installers, shell scripts, and Perl scripts, and aggressively automate manual tasks through scripting.
- Provide feedback from customer users to the team.
- Maintain an ongoing record of problem analysis and resolution activity in an on-call tracking system.

Desired Skills

- Good exposure to MySQL and writing SQL queries; should be able to write and understand complex joins.
- Automation scripting: knowledge of a scripting language such as Python, Perl, or shell script is required.
- Proven success in a fast-paced production support environment.
- Programming experience in any language (C++/Java/Ruby on Rails will be a plus).
- Good problem-solving and analytical skills.
- Demonstrated ability to communicate effectively in writing, ideally with a successful track record of responding to and resolving customer issues through written communication.
- Excellent written and oral communication skills.
- Prior experience in the e-commerce or web services domain will be a plus.
- Ability to work effectively with cross-functional teams.
- Ability to understand business problems and provide solutions and information related to the business requirements.
- Exposure to Hadoop and Hive queries will be an added advantage.

Experience: 2+ years

PhonePe Full-Time Employee Benefits (not applicable for intern or contract roles)

- Insurance Benefits: Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance
- Wellness Program: Employee Assistance Program, Onsite Medical Center, Emergency Support System
- Parental Support: Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program
- Mobility Benefits: Relocation benefits, Transfer Support Policy, Travel Policy
- Retirement Benefits: Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment
- Other Benefits: Higher Education Assistance, Car Lease, Salary Advance Policy

Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog.

Life at PhonePe | PhonePe in the news

Posted 1 week ago


Exploring Hive Jobs in India

Apache Hive is a popular data warehousing tool, built on top of Hadoop, for querying and managing large datasets in distributed storage using a SQL-like language (HiveQL). In India, demand for professionals with Hive expertise is on the rise, with many organizations hiring skilled individuals for roles in data processing and analysis.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Delhi

These cities are known for their thriving tech industries and offer numerous opportunities for professionals looking to work with Hive.

Average Salary Range

The average salary range for Hive professionals in India varies based on experience level. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.

Career Path

Typically, a career in Hive progresses from roles such as Junior Developer or Data Analyst to Senior Developer, Tech Lead, and eventually Architect or Data Engineer. Continuous learning and hands-on experience with Hive are crucial for advancing in this field.

Related Skills

Apart from expertise in Hive, professionals in this field are often expected to have knowledge of SQL, Hadoop, data modeling, ETL processes, and data visualization tools like Tableau or Power BI.
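Since ETL experience is listed alongside Hive and SQL, here is a toy extract-transform-load pass in pure Python (the field layout and city data are invented for illustration; a real pipeline would read from files or HDFS and load into a warehouse table):

```python
# Hypothetical ETL sketch: extract CSV text, transform (clean + cast), load into a summary.
import csv
import io

raw = """city,amount
bangalore, 250
pune,75
bangalore,100
"""

# Extract: parse CSV from an in-memory string
records = list(csv.DictReader(io.StringIO(raw)))

# Transform: normalise city names, cast amounts to int
clean = [{"city": r["city"].strip().title(), "amount": int(r["amount"])} for r in records]

# Load: aggregate into a dict keyed by city (stand-in for a warehouse table)
totals = {}
for r in clean:
    totals[r["city"]] = totals.get(r["city"], 0) + r["amount"]

print(totals)  # {'Bangalore': 350, 'Pune': 75}
```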

Interview Questions

  • What is Hive and how does it differ from traditional databases? (basic)
  • Explain the difference between HiveQL and SQL. (medium)
  • How do you optimize Hive queries for better performance? (advanced)
  • What are the different types of tables supported in Hive? (basic)
  • Can you explain the concept of partitioning in Hive tables? (medium)
  • What is the significance of metastore in Hive? (basic)
  • How does Hive handle schema evolution? (advanced)
  • Explain the use of SerDe in Hive. (medium)
  • What are the various file formats supported by Hive? (basic)
  • How do you troubleshoot performance issues in Hive queries? (advanced)
  • Describe the process of joining tables in Hive. (medium)
  • What is dynamic partitioning in Hive and when is it used? (advanced)
  • How can you schedule jobs in Hive? (medium)
  • Discuss the differences between bucketing and partitioning in Hive. (advanced)
  • How do you handle null values in Hive? (basic)
  • Explain the role of the Hive execution engine in query processing. (medium)
  • Can you give an example of a complex Hive query you have written? (advanced)
  • How does Hive support ACID transactions? (medium)
  • Discuss the advantages and disadvantages of using Hive for data processing. (advanced)
  • How do you secure data in Hive? (medium)
  • What are the limitations of Hive? (basic)
  • Explain the concept of bucketing in Hive and when it is used. (medium)
  • Discuss the role of Hive in the Hadoop ecosystem. (basic)
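Several of the questions above concern partitioning and bucketing. The two layout ideas can be sketched in plain Python (directory names, column names, and the bucket count are illustrative; Hive computes buckets with its own hash function, and a simple modulo on an integer key is used here as a stand-in):

```python
# Sketch of two Hive storage layouts:
# - partitioning: each distinct key value becomes its own directory (pruned at query time)
# - bucketing: rows are spread across a fixed number of files by hash(key) % num_buckets

rows = [
    {"user_id": 101, "dt": "2024-01-01"},
    {"user_id": 102, "dt": "2024-01-01"},
    {"user_id": 103, "dt": "2024-01-02"},
]

# Partitioning: group rows under dt=<value>/ directories
partitions = {}
for r in rows:
    partitions.setdefault(f"dt={r['dt']}", []).append(r["user_id"])

# Bucketing: assign each row to one of a fixed number of buckets
NUM_BUCKETS = 2
buckets = {b: [] for b in range(NUM_BUCKETS)}
for r in rows:
    buckets[r["user_id"] % NUM_BUCKETS].append(r["user_id"])

print(partitions)  # {'dt=2024-01-01': [101, 102], 'dt=2024-01-02': [103]}
print(buckets)     # {0: [102], 1: [101, 103]}
```

The contrast is the usual interview answer: partitions scale with the number of distinct key values and help prune whole directories, while buckets are fixed in number and help with sampling and bucketed joins.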

Closing Remark

As you explore job opportunities in the field of Hive in India, remember to showcase your expertise and passion for data processing and analysis. Prepare well for interviews by honing your skills and staying updated with the latest trends in the industry. Best of luck in your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies